FileToFileContent
Converts local files into FileContent objects that can be embedded into ChatMessage objects and passed to an LLM.
Basic Information
- Type: haystack.components.converters.file_to_file_content.FileToFileContent
- Components it can connect with:
  - FileTypeRouter: FileToFileContent can receive files from FileTypeRouter.
  - ChatPromptBuilder: FileToFileContent can send FileContent objects to ChatPromptBuilder for inclusion in chat messages.
Inputs
| Parameter | Type | Default | Description |
|---|---|---|---|
| sources | List[Union[str, Path, ByteStream]] | | List of file paths or ByteStream objects to convert. |
| extra | Optional[Union[Dict[str, Any], List[Dict[str, Any]]]] | None | Optional extra information to attach to the FileContent objects. Can be used to store provider-specific information. Values should be JSON serializable. This value can be a list of dictionaries or a single dictionary. If it's a single dictionary, its content is added to the extra of all produced FileContent objects. If it's a list, its length must match the number of sources as they're zipped together. |
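The broadcast-versus-zip behavior of the extra parameter can be illustrated with a small stdlib sketch. This is not Haystack code, only a hypothetical helper mirroring the rule described above: a single dict is attached to every source, while a list of dicts must match the sources one-to-one.

```python
from typing import Any, Dict, List, Optional, Union


def attach_extra(
    sources: List[str],
    extra: Optional[Union[Dict[str, Any], List[Dict[str, Any]]]] = None,
) -> List[Dict[str, Any]]:
    """Illustrative helper (not part of Haystack) mirroring the documented
    semantics of the `extra` parameter."""
    if extra is None:
        return [{} for _ in sources]
    if isinstance(extra, dict):
        # A single dict is broadcast: its content is attached to every
        # produced FileContent object.
        return [dict(extra) for _ in sources]
    # A list is zipped with the sources, so the lengths must match.
    if len(extra) != len(sources):
        raise ValueError("Length of `extra` must match the number of sources")
    return [dict(e) for e in extra]


print(attach_extra(["report.pdf", "chart.png"], {"provider": "openai"}))
# → [{'provider': 'openai'}, {'provider': 'openai'}]
print(attach_extra(["report.pdf", "chart.png"], [{"page": 1}, {"dpi": 300}]))
# → [{'page': 1}, {'dpi': 300}]
```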
Outputs
| Parameter | Type | Description |
|---|---|---|
| file_contents | List[FileContent] | A list of FileContent objects created from the input files. |
Overview
FileToFileContent converts local files into FileContent objects. These objects contain the base64-encoded file data, MIME type, and filename. You can embed them into ChatMessage objects to pass files directly to an LLM, for example, to ask the LLM questions about a PDF or an image.
The component automatically detects the MIME type of each file. Empty files are skipped with a warning.
Usage Example
Using the Component in a Pipeline
In this pipeline, FileToFileContent converts files and passes them to a chat generator through a ChatPromptBuilder:
components:
FileToFileContent:
type: haystack.components.converters.file_to_file_content.FileToFileContent
init_parameters: {}
ChatPromptBuilder:
type: haystack.components.builders.chat_prompt_builder.ChatPromptBuilder
init_parameters:
template:
- _content:
- text: "Analyze the following files and answer questions about them."
_role: system
- _content:
- text: "{{ query }}"
_role: user
required_variables:
variables:
ChatGenerator:
type: haystack.components.generators.chat.openai.OpenAIChatGenerator
init_parameters:
model: gpt-4o
api_key:
type: env_var
env_vars:
- OPENAI_API_KEY
strict: false
connections:
- sender: FileToFileContent.file_contents
receiver: ChatPromptBuilder.file_contents
- sender: ChatPromptBuilder.prompt
receiver: ChatGenerator.messages
inputs:
files:
- FileToFileContent.sources
query:
- ChatPromptBuilder.query
outputs:
replies: ChatGenerator.replies
max_runs_per_component: 100
metadata: {}
Parameters
Init Parameters
This component has no init parameters.
Run Method Parameters
These are the parameters you can configure for the component's run() method. This means you can pass these parameters at query time through the API, in Playground, or when running a job. For details, see Modify Pipeline Parameters at Query Time.
| Parameter | Type | Default | Description |
|---|---|---|---|
| sources | List[Union[str, Path, ByteStream]] | | List of file paths or ByteStream objects to convert. |
| extra | Optional[Union[Dict[str, Any], List[Dict[str, Any]]]] | None | Optional extra information to attach to the FileContent objects. Can be used to store provider-specific information. Values should be JSON serializable. This value can be a list of dictionaries or a single dictionary. If it's a single dictionary, its content is added to the extra of all produced FileContent objects. If it's a list, its length must match the number of sources as they're zipped together. |