Configure an Agent: Advanced Settings
Configure advanced model and agent parameters.
About This Task
- For full Agent component documentation, see Agent.
- For an explanation of Agent's workflow, see AI Agent.
Prerequisites
Complete the following task before you start: Configure an Agent: Model, System Prompt, and Tools.
Access Agent's Advanced Settings
- In Builder, click Model on the Agent component card to open its configuration panel.
- Switch to the Advanced tab.
- Configure the settings as needed. The Parameters section contains model-specific settings. For details on how to configure the model, refer to the model's documentation.
The Other section contains Agent-specific settings:
max_agent_steps: The maximum number of actions the agent can perform. For more complicated tasks, you may need to increase this value. Note that increasing it can also increase the cost and running time of the task.
exit_conditions: The conditions that cause the agent to stop. For example, you can configure the agent to stop after a tool is used or once the model returns a text response.
  - To stop the agent when it generates a text response, choose text.
  - To stop the agent after a specific tool is used, choose the tool's name from the list.
  - You can choose multiple exit conditions; the agent stops when any of them is met. For example, to stop the agent when it generates a text response or after a specific tool is used, choose text and the tool's name.
retry_on_tool_failure: Automatically retries the agent if a tool call fails. This is useful when a tool call times out or fails temporarily.
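If you prefer working in YAML, these settings can also be set as parameters on the Agent component. The following is a minimal sketch that assumes the parameter names described above and a hypothetical tool named search_web; the component name and the rest of the pipeline definition are omitted:
agent:
  init_parameters:
    # Maximum number of actions before the agent stops
    # (higher values can increase cost and running time)
    max_agent_steps: 20
    # Stop when the model returns a text response
    # or after the (hypothetical) search_web tool runs
    exit_conditions:
      - text
      - search_web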
Configure a Different Model
You can use a model other than those available in the Model list. To do this, switch to YAML and set the model there through the Agent's chat_generator parameter. Make sure there is a ChatGenerator component for the model you want to use and that Haystack Platform is connected to the model provider. For details on how to connect, see Using Hosted Models and External Services.
For example, to use a model hosted on Together AI, pass TogetherAIChatGenerator as the value of the Agent's chat_generator parameter:
agent:
  init_parameters:
    chat_generator:
      type: haystack_integrations.components.generators.togetherai.chat.chat_generator.TogetherAIChatGenerator
      init_parameters:
        api_key: {"type": "env_var", "env_vars": ["TOGETHERAI_API_KEY"], "strict": false}
        model: deepseek-ai/DeepSeek-R1
        generation_kwargs:
          max_tokens: 650
          temperature: 0
          seed: 0
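In this example, the api_key setting tells the ChatGenerator to read the key from the TOGETHERAI_API_KEY environment variable, so make sure that variable (or the corresponding connection in Haystack Platform) is configured before you run the pipeline.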