
AnthropicChatGenerator

Use Anthropic's chat completion models.

Basic Information

  • Type: haystack_integrations.components.generators.anthropic.chat.chat_generator.AnthropicChatGenerator
  • Components it can connect with:
    • ChatPromptBuilder: AnthropicChatGenerator receives chat messages from ChatPromptBuilder.
    • OutputAdapter: AnthropicChatGenerator can send its generated replies to an OutputAdapter configured to convert them into a list of strings that DeepsetAnswerBuilder can accept.

Inputs

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| messages | List[ChatMessage] | | A list of ChatMessage objects representing the input messages. |
| generation_kwargs | Optional[Dict[str, Any]] | None | Additional keyword arguments for the model. |
| streaming_callback | Optional[StreamingCallbackT] | None | An optional callback function to handle streaming chunks. |
| tools | Optional[Union[List[Tool], Toolset]] | None | A list of tool objects or a toolset that the model can use. Each tool must have a unique name. |
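The tools parameter requires every tool to have a unique name. A minimal sketch of that constraint using plain dicts (the tool definitions here are hypothetical stand-ins, not Haystack Tool objects):

```python
# Stand-in tool definitions; real pipelines pass Haystack Tool/Toolset objects.
tools = [
    {"name": "search", "description": "Search the document store"},
    {"name": "calculator", "description": "Evaluate a math expression"},
]

# The uniqueness check the component enforces, expressed in plain Python.
names = [t["name"] for t in tools]
assert len(names) == len(set(names)), "tool names must be unique"
print(sorted(names))  # ['calculator', 'search']
```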

Outputs

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| replies | List[ChatMessage] | | A list of generated replies. |

Overview

For a list of Anthropic models you can use, see Anthropic Models.

You can customize how the text is generated by passing parameters to the Anthropic API. Use the generation_kwargs parameter to do this. Any parameter that works with the Anthropic Messages API (messages.create) also works here. For a complete list of parameters, see the Anthropic API documentation.
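Generation settings can be supplied both at init time and at run time. A minimal sketch of the merge semantics (an assumption: a plain dict update in which run-time values win, which matches how keyword-argument overrides commonly behave in Haystack generators):

```python
# Settings configured on the component at init time.
init_kwargs = {"temperature": 0.7, "max_tokens": 500}

# Settings passed to run() for a single query.
run_kwargs = {"temperature": 0.2}

# Run-time values override init-time values; untouched keys are kept.
merged = {**init_kwargs, **run_kwargs}
print(merged)  # {'temperature': 0.2, 'max_tokens': 500}
```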

Authentication

To use this component, connect Haystack Platform with Anthropic first. You'll need an Anthropic API key to do this.

For details on using Anthropic models, see Use Anthropic Models.

Usage Example

Using the Component in a Pipeline

This is a RAG pipeline that uses Claude Sonnet 4:

```yaml
components:
  bm25_retriever:
    type: haystack_integrations.components.retrievers.opensearch.bm25_retriever.OpenSearchBM25Retriever
    init_parameters:
      document_store:
        type: haystack_integrations.document_stores.opensearch.document_store.OpenSearchDocumentStore
        init_parameters:
          hosts:
            - ${OPENSEARCH_HOST}
          http_auth:
            - ${OPENSEARCH_USER}
            - ${OPENSEARCH_PASSWORD}
          use_ssl: true
          verify_certs: false
      top_k: 20

  query_embedder:
    type: haystack.components.embedders.sentence_transformers_text_embedder.SentenceTransformersTextEmbedder
    init_parameters:
      model: intfloat/e5-base-v2

  embedding_retriever:
    type: haystack_integrations.components.retrievers.opensearch.embedding_retriever.OpenSearchEmbeddingRetriever
    init_parameters:
      document_store:
        type: haystack_integrations.document_stores.opensearch.document_store.OpenSearchDocumentStore
        init_parameters:
          hosts:
            - ${OPENSEARCH_HOST}
          http_auth:
            - ${OPENSEARCH_USER}
            - ${OPENSEARCH_PASSWORD}
          use_ssl: true
          verify_certs: false
      top_k: 20

  document_joiner:
    type: haystack.components.joiners.document_joiner.DocumentJoiner
    init_parameters:
      join_mode: concatenate

  ranker:
    type: haystack.components.rankers.transformers_similarity.TransformersSimilarityRanker
    init_parameters:
      model: intfloat/simlm-msmarco-reranker
      top_k: 8

  chat_prompt_builder:
    type: haystack.components.builders.chat_prompt_builder.ChatPromptBuilder
    init_parameters:
      template:
        - _content:
            - text: |
                You are a helpful assistant answering questions based on the provided documents.
                If the documents don't contain the answer, say so.
                Do not use your own knowledge.
          _role: system
        - _content:
            - text: |
                Documents:
                {% for document in documents %}
                Document [{{ loop.index }}]:
                {{ document.content }}
                {% endfor %}

                Question: {{ query }}
          _role: user

  anthropic_chat_generator:
    type: haystack_integrations.components.generators.anthropic.chat.chat_generator.AnthropicChatGenerator
    init_parameters:
      model: claude-sonnet-4-20250514
      generation_kwargs:
        temperature: 0.7
        max_tokens: 500

  output_adapter:
    type: haystack.components.converters.output_adapter.OutputAdapter
    init_parameters:
      template: '{{ replies[0] }}'
      output_type: List[str]

  answer_builder:
    type: haystack.components.builders.answer_builder.AnswerBuilder
    init_parameters: {}

connections:
  - sender: bm25_retriever.documents
    receiver: document_joiner.documents
  - sender: query_embedder.embedding
    receiver: embedding_retriever.query_embedding
  - sender: embedding_retriever.documents
    receiver: document_joiner.documents
  - sender: document_joiner.documents
    receiver: ranker.documents
  - sender: ranker.documents
    receiver: chat_prompt_builder.documents
  - sender: ranker.documents
    receiver: answer_builder.documents
  - sender: chat_prompt_builder.prompt
    receiver: anthropic_chat_generator.messages
  - sender: anthropic_chat_generator.replies
    receiver: output_adapter.replies
  - sender: output_adapter.output
    receiver: answer_builder.replies

max_runs_per_component: 100

inputs:
  query:
    - bm25_retriever.query
    - query_embedder.text
    - ranker.query
    - chat_prompt_builder.query
    - answer_builder.query
  filters:
    - bm25_retriever.filters
    - embedding_retriever.filters

outputs:
  documents: ranker.documents
  answers: answer_builder.answers

metadata: {}
```
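The connections in this pipeline form a directed acyclic graph, so there is a valid execution order. A quick stdlib sketch that checks this (component names are copied from the config; reading each sender/receiver pair as a directed edge is the only assumption):

```python
from graphlib import TopologicalSorter

# Component-to-component edges read off the connections section above.
connections = [
    ("bm25_retriever", "document_joiner"),
    ("query_embedder", "embedding_retriever"),
    ("embedding_retriever", "document_joiner"),
    ("document_joiner", "ranker"),
    ("ranker", "chat_prompt_builder"),
    ("ranker", "answer_builder"),
    ("chat_prompt_builder", "anthropic_chat_generator"),
    ("anthropic_chat_generator", "output_adapter"),
    ("output_adapter", "answer_builder"),
]

# TopologicalSorter expects a mapping of node -> predecessors.
graph: dict[str, set[str]] = {}
for sender, receiver in connections:
    graph.setdefault(receiver, set()).add(sender)

# Raises CycleError if the connections ever formed a loop.
order = list(TopologicalSorter(graph).static_order())
print(order)
```

Because every component eventually feeds answer_builder, it always comes last in the computed order.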

Parameters

Init Parameters

These are the parameters you can configure in Pipeline Builder:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| api_key | Secret | Secret.from_env_var('ANTHROPIC_API_KEY') | The Anthropic API key. |
| model | str | claude-sonnet-4-20250514 | The name of the Anthropic model to use. |
| streaming_callback | Optional[Callable[[StreamingChunk], None]] | None | An optional callback function to handle streaming chunks. |
| system_prompt | Optional[str] | None | An optional system prompt to use for generation. |
| generation_kwargs | Optional[Dict[str, Any]] | None | Additional keyword arguments for generation. |
| timeout | Optional[float] | None | The timeout for requests. |
| max_retries | Optional[int] | None | The maximum number of retries if a request fails. |

Run Method Parameters

These are the parameters you can configure for the component's run() method. This means you can pass these parameters at query time through the API, in Playground, or when running a job. For details, see Modify Pipeline Parameters at Query Time.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| messages | List[ChatMessage] | | A list of ChatMessage objects representing the input messages. |
| generation_kwargs | Optional[Dict[str, Any]] | None | Additional keyword arguments for generation. For a complete list, see the Anthropic API documentation. |
| streaming_callback | Optional[Callable[[StreamingChunk], None]] | None | An optional callback function to handle streaming chunks. |
| tools | Optional[Union[List[Tool], Toolset]] | None | A list of tool objects or a toolset that the model can use. Each tool must have a unique name. |
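Conceptually, a streaming_callback is invoked once per chunk as the model generates text. A minimal sketch of that contract (plain strings stand in for Haystack's StreamingChunk objects, an assumption made for brevity):

```python
# Accumulates streamed text as it arrives.
collected: list[str] = []

def streaming_callback(chunk: str) -> None:
    # Called once per streamed chunk; a real callback might instead
    # flush chunk text to a websocket or terminal.
    collected.append(chunk)

# Simulate the generator streaming three chunks.
for chunk in ["The ", "answer ", "is 42."]:
    streaming_callback(chunk)

print("".join(collected))  # The answer is 42.
```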