GoogleGenAIChatGenerator
Complete chats using Google's Gemini models through the Google Gen AI SDK.
Basic Information
- Type: haystack_integrations.components.generators.google_genai.chat.chat_generator.GoogleGenAIChatGenerator
- Components it can connect with:
  - ChatPromptBuilder: GoogleGenAIChatGenerator receives a rendered prompt from ChatPromptBuilder.
  - DeepsetAnswerBuilder: GoogleGenAIChatGenerator sends the generated replies to DeepsetAnswerBuilder through OutputAdapter (see Usage Example below).
Inputs
| Parameter | Type | Default | Description |
|---|---|---|---|
| messages | List[ChatMessage] | | A list of ChatMessage instances representing the input messages. |
| generation_kwargs | Optional[Dict[str, Any]] | None | Additional keyword arguments for the model. For details, see model documentation. |
| safety_settings | Optional[List[Dict[str, Any]]] | None | Safety settings for content filtering. If provided, it will override the default settings. |
| streaming_callback | Optional[StreamingCallbackT] | None | A callback function that is called when a new token is received from the stream. |
| tools | Optional[Union[List[Tool], Toolset]] | None | A list of Tool objects or a Toolset that the model can use. |
Outputs
| Parameter | Type | Description |
|---|---|---|
| replies | List[ChatMessage] | A list containing the generated ChatMessage responses. |
Overview
GoogleGenAIChatGenerator provides an interface to Google's Gemini models through the new google-genai SDK,
supporting models like gemini-2.0-flash and other Gemini variants.
Authorization
You need a Google AI Studio API key to use this component. Connect deepset to your Google AI Studio account on the Integrations page.
Connection Instructions
- Click your profile icon in the top right corner and choose Integrations.

- Click Connect next to the provider.
- Enter your API key and submit it.
Usage Example
Initializing the Component
```yaml
components:
  GoogleGenAIChatGenerator:
    type: haystack_integrations.components.generators.google_genai.chat.chat_generator.GoogleGenAIChatGenerator
    init_parameters:
```
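The following sketch shows how the component typically fits into a chat pipeline: ChatPromptBuilder renders the prompt, GoogleGenAIChatGenerator produces the replies, and OutputAdapter extracts the reply text for DeepsetAnswerBuilder. The component names, the OutputAdapter template, and the DeepsetAnswerBuilder type path are illustrative assumptions; adjust them to match your pipeline.

```yaml
components:
  chat_prompt_builder:
    type: haystack.components.builders.chat_prompt_builder.ChatPromptBuilder
    # Assumption: the chat prompt template is omitted here for brevity.
    init_parameters: {}

  llm:
    type: haystack_integrations.components.generators.google_genai.chat.chat_generator.GoogleGenAIChatGenerator
    init_parameters:
      model: gemini-2.0-flash

  # Assumption: the adapter extracts the text of the first reply so it can be
  # passed on to the answer builder.
  adapter:
    type: haystack.components.converters.output_adapter.OutputAdapter
    init_parameters:
      template: "{{ replies[0].text }}"
      output_type: str

  answer_builder:
    type: deepset_cloud_custom_nodes.augmenters.deepset_answer_builder.DeepsetAnswerBuilder
    init_parameters: {}

connections:
  - sender: chat_prompt_builder.prompt
    receiver: llm.messages
  - sender: llm.replies
    receiver: adapter.replies
  - sender: adapter.output
    receiver: answer_builder.replies
```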
Parameters
Init Parameters
These are the parameters you can configure in Pipeline Builder:
| Parameter | Type | Default | Description |
|---|---|---|---|
| api_key | Secret | Secret.from_env_var(['GOOGLE_API_KEY', 'GEMINI_API_KEY'], strict=False) | Google API key, defaults to the GOOGLE_API_KEY and GEMINI_API_KEY environment variables. Not needed if using Vertex AI with Application Default Credentials. Go to Google AI Studio for a Gemini API key. Go to Google Cloud Vertex AI for a Vertex AI API key. |
| api | Literal['gemini', 'vertex'] | gemini | The API to use. Either "gemini" for the Gemini Developer API or "vertex" for Vertex AI. |
| vertex_ai_project | Optional[str] | None | Google Cloud project ID for Vertex AI. Required when using Vertex AI with Application Default Credentials. |
| vertex_ai_location | Optional[str] | None | Google Cloud location for Vertex AI (for example, "us-central1", "europe-west1"). Required when using Vertex AI with Application Default Credentials. |
| model | str | gemini-2.0-flash | Name of the model to use (for example, "gemini-2.0-flash"). |
| generation_kwargs | Optional[Dict[str, Any]] | None | Configuration for generation (temperature, max_tokens, and more). |
| safety_settings | Optional[List[Dict[str, Any]]] | None | Safety settings for content filtering. |
| streaming_callback | Optional[StreamingCallbackT] | None | A callback function that is called when a new token is received from the stream. |
| tools | Optional[Union[List[Tool], Toolset]] | None | A list of Tool objects or a Toolset that the model can use. Each tool should have a unique name. |
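For reference, here is a hypothetical init_parameters block that combines several of the options above. The generation_kwargs keys and safety_settings entries follow the Google Gen AI SDK's generation config and safety setting formats; treat the specific values as illustrative assumptions and check the model documentation for what your model supports.

```yaml
components:
  GoogleGenAIChatGenerator:
    type: haystack_integrations.components.generators.google_genai.chat.chat_generator.GoogleGenAIChatGenerator
    init_parameters:
      model: gemini-2.0-flash
      # Passed through to the Google Gen AI SDK's generation config
      generation_kwargs:
        temperature: 0.2
        top_p: 0.95
        max_output_tokens: 1024
      # Each entry pairs a harm category with a blocking threshold
      safety_settings:
        - category: HARM_CATEGORY_HATE_SPEECH
          threshold: BLOCK_MEDIUM_AND_ABOVE
        - category: HARM_CATEGORY_HARASSMENT
          threshold: BLOCK_MEDIUM_AND_ABOVE
```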
Run Method Parameters
These are the parameters you can configure for the component's run() method. This means you can pass these parameters at query time through the API, in Playground, or when running a job. For details, see Modify Pipeline Parameters at Query Time.
| Parameter | Type | Default | Description |
|---|---|---|---|
| messages | List[ChatMessage] | | A list of ChatMessage instances representing the input messages. |
| generation_kwargs | Optional[Dict[str, Any]] | None | Configuration for generation. |
| safety_settings | Optional[List[Dict[str, Any]]] | None | Safety settings for content filtering. |
| streaming_callback | Optional[StreamingCallbackT] | None | A callback function that is called when a new token is received from the stream. |
| tools | Optional[Union[List[Tool], Toolset]] | None | A list of Tool objects or a Toolset that the model can use. |
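As a sketch of what this looks like in practice, the snippet below overrides generation_kwargs and safety_settings for a single query by addressing the component by its pipeline name, following the pattern described in Modify Pipeline Parameters at Query Time. The payload structure and values are assumptions based on that page; the component name must match the one used in your pipeline YAML.

```yaml
params:
  GoogleGenAIChatGenerator:
    generation_kwargs:
      temperature: 0.1
    safety_settings:
      - category: HARM_CATEGORY_DANGEROUS_CONTENT
        threshold: BLOCK_ONLY_HIGH
```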