DeepsetTogetherAIGenerator
Generate text using large language models hosted on Together AI.
Basic Information
- Type:
deepset_cloud_custom_nodes.generators.togetherai.DeepsetTogetherAIGenerator
- Components it can connect with:
- PromptBuilder: DeepsetTogetherAIGenerator receives the prompt from PromptBuilder.
- DeepsetAnswerBuilder: DeepsetTogetherAIGenerator sends the generated replies to DeepsetAnswerBuilder, which uses them to build GeneratedAnswer objects.
Inputs
Required Inputs
Name | Type | Description |
---|---|---|
prompt | String | The prompt with instructions for the model. |
Optional Inputs
Name | Type | Default | Description |
---|---|---|---|
generation_kwargs | Dictionary of string and any | None | Additional keyword arguments you want to pass to the model. For a list of parameters you can use, see the Together AI API documentation. |
system_prompt | String | None | A set of instructions for the model that shapes how it behaves and responds throughout the interaction. It can include guidelines about the AI's personality, tone, capabilities, constraints, and specific rules it should follow when generating responses. |
streaming_callback | Callable | None | A callback function invoked when the model receives a new token from the stream. |
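Instead of passing them at query time, you can also fix these options when the component is initialized. A minimal sketch of the relevant init_parameters, assuming the parameter names listed in the Init Parameters table below (the values are illustrative):

llm:
  type: deepset_cloud_custom_nodes.generators.togetherai.DeepsetTogetherAIGenerator
  init_parameters:
    model: deepseek-ai/DeepSeek-R1
    # Fixed instructions applied to every request (illustrative value)
    system_prompt: You answer questions based only on the provided documents.
    # Extra keyword arguments forwarded to the model
    generation_kwargs:
      max_tokens: 650
      temperature: 0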
Outputs
Name | Type | Description |
---|---|---|
replies | List of strings | Generated responses. |
meta | List of dictionaries | Metadata for each response. |
Overview
DeepsetTogetherAIGenerator generates answers to queries using models hosted on Together AI. For a complete list of models you can use, check the Together AI documentation.
Authentication
You need an API key from Together AI to use their models. For details on obtaining it, see Together AI Quickstart.
Once you have the API key, connect deepset to Together AI on the Connections page. For detailed instructions, see Use Together AI Models.
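In pipeline YAML, the key is typically referenced as an environment-variable secret rather than written out in plain text. A minimal sketch, mirroring the api_key line from the usage example below:

llm:
  type: deepset_cloud_custom_nodes.generators.togetherai.DeepsetTogetherAIGenerator
  init_parameters:
    # Read the key from the TOGETHERAI_API_KEY environment variable;
    # strict: false means the secret doesn't raise an error if the variable is unset.
    api_key: {"type": "env_var", "env_vars": ["TOGETHERAI_API_KEY"], "strict": false}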
Usage Example
This query pipeline uses the DeepSeek-R1 model hosted on Together AI:
components:
  ...
  prompt_builder:
    type: haystack.components.builders.prompt_builder.PromptBuilder
    init_parameters:
      template: |-
        You are a technical expert.
        You answer questions truthfully based on provided documents.
        For each document check whether it is related to the question.
        Only use documents that are related to the question to answer it.
        Ignore documents that are not related to the question.
        If the answer exists in several documents, summarize them.
        Only answer based on the documents provided. Don't make things up.
        If the documents can't answer the question or you are unsure say: 'The answer can't be found in the text'.
        These are the documents:
        {% for document in documents %}
        Document[{{ loop.index }}]:
        {{ document.content }}
        {% endfor %}
        Question: {{question}}
        Answer:
  llm:
    type: deepset_cloud_custom_nodes.generators.togetherai.DeepsetTogetherAIGenerator
    init_parameters:
      api_key: {"type": "env_var", "env_vars": ["TOGETHERAI_API_KEY"], "strict": false}
      model: deepseek-ai/DeepSeek-R1
      generation_kwargs:
        max_tokens: 650
        temperature: 0
        seed: 0
  answer_builder:
    type: haystack.components.builders.answer_builder.AnswerBuilder
  ...
connections:
  ...
  - sender: prompt_builder.prompt
    receiver: llm.prompt
  - sender: llm.replies
    receiver: answer_builder.replies
  ...
When building your pipeline in Pipeline Builder, drag DeepsetTogetherAIGenerator from the Connectors group onto the canvas. Then connect its prompt input to PromptBuilder and its replies output to DeepsetAnswerBuilder.

Init Parameters
Parameter | Type | Possible Values | Description |
---|---|---|---|
api_key | Secret | Default: | Together AI API key. Required. |
model | String | Default: | The path to the model to use. |
api_base_url | String | Default: | The base URL of the Together AI API. |
streaming_callback | Callable | Default: None | A callback function called when a new token is received from the stream. This parameter also controls whether the generator streams: to make it stream, pass a callback function. |
system_prompt | String | Default: None | A set of instructions for the model that shapes how it behaves and responds throughout the interaction. It can include guidelines about the AI's personality, tone, capabilities, constraints, and specific rules it should follow when generating responses. |
generation_kwargs | Dictionary | Default: None | Other parameters to use for the model. These parameters are sent directly to the Together AI API endpoint. |
timeout | Float | Default: None | Timeout for Together AI client calls. If not set, it's inferred from the environment. |
max_retries | Integer | Default: None | Maximum number of retries to contact Together AI after an internal error. If not set, it's inferred from the environment. |
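To illustrate the client-level options from the table, here is a hedged sketch of init_parameters that also sets timeout and max_retries (the values are illustrative, not defaults; the other settings mirror the usage example above):

llm:
  type: deepset_cloud_custom_nodes.generators.togetherai.DeepsetTogetherAIGenerator
  init_parameters:
    api_key: {"type": "env_var", "env_vars": ["TOGETHERAI_API_KEY"], "strict": false}
    model: deepseek-ai/DeepSeek-R1
    # Client behavior (illustrative values)
    timeout: 60
    max_retries: 3
    generation_kwargs:
      max_tokens: 650
      temperature: 0
      seed: 0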