
DeepsetFileUploader

Use DeepsetFileUploader to send files from your pipeline run to Haystack Platform temporary file storage. The component returns stable references (file_id and file_name) that you can expose as pipeline output so users can download those files from the product.

Think of the component as a handoff step: your pipeline (or another component) already has files, for example a generated spreadsheet or a chart saved to disk. Those files live on the worker where the pipeline ran. DeepsetFileUploader copies them to the platform and gives back IDs the UI and APIs understand, so the same run can offer downloadable attachments instead of only text.

The sole task of this component is to upload files and return references.
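The contract is deliberately narrow: a list of files in, a list of references out. The following stdlib-only sketch illustrates that input/output shape; it is not the real implementation (which calls the platform API), and the upload_files function and generated IDs are hypothetical:

```python
import uuid
from pathlib import Path


def upload_files(files: list[str]) -> list[dict[str, str]]:
    """Illustrative stand-in for DeepsetFileUploader.run():
    returns one {file_id, file_name} reference per input file."""
    references = []
    for entry in files:
        references.append({
            "file_id": str(uuid.uuid4()),   # the platform assigns the real ID
            "file_name": Path(entry).name,  # for paths, the name comes from the path
        })
    return references


refs = upload_files(["/tmp/report.csv", "/tmp/chart.png"])
print(refs[0]["file_name"])  # report.csv
```

Downstream components and the pipeline output only ever see these references, never the file bytes themselves.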

When to Use It

  • You generate or export a file during a run (report, CSV, image, PDF, archive) and want it to appear as a download in Playground, jobs, or integrated apps.
  • An upstream component writes files to paths you know, or produces in-memory ByteStream objects, and you need platform-managed file IDs for the rest of the flow or for output.
  • You want credentials and workspace routing in environment variables, not in pipeline YAML.

Configuration

For DeepsetFileUploader to work, create the following secrets in the Haystack Platform workspace or organization where you want to use the component:

  • Secret name: DEEPSET_API_URL, secret value: base URL of the deepset API. Set it to https://api.cloud.deepset.ai.
  • Secret name: DEEPSET_API_KEY, secret value: the API key to connect to your Haystack Platform workspace or organization. For details on how to obtain it, see Generate API Keys.

Optionally, set the workspace name in the DEEPSET_WORKSPACE secret. If omitted, the component uses default.

For instructions on how to create secrets, see Add Secrets.
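The component reads these values from the environment at runtime. As a minimal stdlib sketch of the lookup order described above (the variable names come from this page; the resolver function itself is hypothetical, not part of the component's public API):

```python
import os


def resolve_settings() -> dict[str, str]:
    """Read the connection settings DeepsetFileUploader expects.
    DEEPSET_API_URL and DEEPSET_API_KEY are required;
    DEEPSET_WORKSPACE falls back to "default" when unset."""
    api_url = os.environ["DEEPSET_API_URL"]
    api_key = os.environ["DEEPSET_API_KEY"]
    workspace = os.environ.get("DEEPSET_WORKSPACE", "default")
    return {"api_url": api_url, "api_key": api_key, "workspace": workspace}


# Example values for local experimentation only; on the platform,
# the secrets you created are injected as environment variables.
os.environ.setdefault("DEEPSET_API_URL", "https://api.cloud.deepset.ai")
os.environ.setdefault("DEEPSET_API_KEY", "example-key")
print(resolve_settings()["workspace"])
```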

Connections

DeepsetFileUploader can receive files from any component that produces a list of strings, paths, or ByteStream objects.

It can send the output files to the Output component, so users can download them from the platform.

Usage Example

Basic DeepsetFileUploader Configuration

This is the basic component configuration. The component has no parameters, so you just specify the component type and leave the init parameters empty.

components:
  file_uploader:
    type: deepset_cloud_custom_nodes.utils.deepset_file_uploader.DeepsetFileUploader
    init_parameters: {}

Pipeline Where Another Component Creates Files

This is an example of a pipeline where a custom component writes files to disk and exposes a list of paths that should become downloads.

components:
  report_exporter:
    type: my_org.custom_nodes.report_exporter.ReportExporter
    init_parameters:
      output_format: csv
  file_uploader:
    type: deepset_cloud_custom_nodes.utils.deepset_file_uploader.DeepsetFileUploader
    init_parameters: {}

connections:
  - sender: report_exporter.saved_paths
    receiver: file_uploader.files

inputs:
  query:
    - "report_exporter.query"

outputs:
  attachments: "file_uploader.files"

max_runs_per_component: 100
metadata: {}

ReportExporter is an example component that must output saved_paths as a list of strings (or paths). The pipeline output key attachments maps to the uploader's files list so the platform can treat them as downloadable file references.
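ReportExporter is not a platform component, so here is a rough stdlib-only sketch of the contract it must satisfy (the real implementation would be a Haystack custom component with the @component decorator, omitted here): write a file to disk and expose its path under a saved_paths key, the shape the uploader's files input accepts.

```python
import csv
import tempfile
from pathlib import Path


def export_report(rows: list[dict[str, str]]) -> dict[str, list[str]]:
    """Hypothetical stand-in for ReportExporter.run(): writes rows
    to a CSV file on the worker's local disk and returns the path
    under `saved_paths`, matching the connection in the YAML above."""
    out_path = Path(tempfile.mkdtemp()) / "report.csv"
    with out_path.open("w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
    return {"saved_paths": [str(out_path)]}


result = export_report([{"metric": "latency", "value": "120ms"}])
print(Path(result["saved_paths"][0]).name)  # report.csv
```

Any component with an output socket of this type (a list of strings or paths) can feed the uploader the same way.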

Pipeline Where Files Are Supplied at Query Time

This is an example of a pipeline where files are supplied at query time. Users upload files to be translated and the pipeline returns the translated files. Note that the Input component is configured to accept files and query inputs, and the Output component is configured to output output_files and messages.

components:
  file_uploader:
    type: deepset_cloud_custom_nodes.utils.deepset_file_uploader.DeepsetFileUploader
    init_parameters: {}
  LLM:
    type: haystack.components.generators.chat.llm.LLM
    init_parameters:
      chat_generator:
        init_parameters:
          model: gpt-5.4
        type: haystack.components.generators.chat.openai_responses.OpenAIResponsesChatGenerator
      system_prompt: |-
        {% message role="system" %}
        You are a professional translator that translates files.
        {% endmessage %}
      user_prompt: |-
        {% message role="user" %}
        Translate the following {{ files }} into English.
        {% endmessage %}
      required_variables: "*"
      streaming_callback:

connections: []

max_runs_per_component: 100

metadata: {}

inputs:
  files:
    - file_uploader.files
    - LLM.files
  query:
    - LLM.messages

outputs:
  output_files: file_uploader.files
  messages: LLM.messages

Wire your client or Playground input schema so the pipeline receives a files value compatible with the uploader (paths or ByteStream-like payloads your platform sends).

Parameters

Inputs

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| files | List[Union[str, Path, ByteStream]] |  | Files to upload: local paths or in-memory streams. For paths, the file name is taken from the path. For ByteStream, set file_name in meta. |
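For in-memory files there is no path to infer a name from, so the name travels in the stream's metadata. A hedged sketch of that shape using a plain stand-in class (the real ByteStream is Haystack's dataclass, not this one):

```python
from dataclasses import dataclass, field


@dataclass
class FakeByteStream:
    """Minimal stand-in mirroring the two fields that matter here:
    raw bytes plus a meta dict carrying the file name."""
    data: bytes
    meta: dict = field(default_factory=dict)


stream = FakeByteStream(
    data=b"col_a,col_b\n1,2\n",
    meta={"file_name": "export.csv"},  # without this, the uploader has no name to register
)
print(stream.meta["file_name"])  # export.csv
```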

Outputs

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| files | List[Dict[str, str]] |  | One entry per uploaded file: file_id (platform ID) and file_name (original name). |

Init Parameters

DeepsetFileUploader has no init parameters. Authentication and workspace selection use environment variables only.

Run Method Parameters

These map to the component's run() method, so you can pass them at query time through the API, in Playground, or when running a job. For details, see Modify Pipeline Parameters at Query Time.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| files | List[Union[str, Path, ByteStream]] |  | Files to upload: paths or streams; for ByteStream, set file_name in meta. |