ChatPromptBuilder

Use ChatPromptBuilder to build a prompt and send it to a ChatGenerator.

Basic Information

For a detailed overview, inputs, and outputs, see the Haystack documentation.

For a detailed explanation of how to write prompts with variables, see Writing Prompts in deepset AI Platform.

Usage Example

With ChatMessages

ChatPromptBuilder sends the instructions to a ChatGenerator in the form of a list of ChatMessage objects. You pass the instructions in the template parameter, which must follow the ChatMessage format:

- _content:
    - content_type: content  # replace content_type with one of the supported content types: text, image, tool_call, tool_call_result
      # the content may contain variables
  _role: role  # supported roles are: user, system, assistant, tool

For example:

- _content:
    - text: |
        You are a helpful assistant answering the user's questions.
        If the answer is not in the documents, rely on the web_search tool to find information.
        Do not use your own knowledge.
  _role: system
- _content:
    - text: |
        Question: {{ query }}
  _role: user

You can also pass documents in the prompt, like this:

- _content:
    - text: |
        Here are the results that your last search yielded.
        {% for doc in documents %}
        {{doc.content}}
        {% endfor %}
        
        Question: {{ query }}
  _role: user
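
To show where this format fits, here is a minimal sketch of a pipeline component configuration that passes such a list of ChatMessages in the template parameter. The component name and type path match the example in the next section; the message contents are taken from the examples above.

components:
  ChatPromptBuilder:
    type: haystack.components.builders.chat_prompt_builder.ChatPromptBuilder
    init_parameters:
      template:
        # the template is a list of ChatMessages in the format shown above
        - _content:
            - text: |
                You are a helpful assistant answering the user's questions.
                Do not use your own knowledge.
          _role: system
        - _content:
            - text: |
                Question: {{ query }}
          _role: user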

With Jinja2 Syntax

You can use Jinja2 strings in ChatPromptBuilder's template parameter through the {% message %} tag. This makes it possible to create structured ChatMessages with mixed content types, such as images and text.

The {% message %} tag supports the following attributes:

  • role (mandatory): system, user, assistant, or tool
  • name (optional): Participant name
  • meta (optional): Metadata dictionary
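
For example, a message that sets the optional name attribute alongside the mandatory role could look like the following sketch; the participant name "customer" is purely illustrative:

{% message role="user" name="customer" %}
{# the name value above is illustrative #}
Question: {{ query }}
{% endmessage %}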

For example, this ChatPromptBuilder contains instructions for classifying follow-up questions and rewriting queries at the start of the pipeline. The template has two chat messages, one with the role "system" and one with the role "assistant", and it also includes the chat history.

components:
  ChatPromptBuilder:
    type: haystack.components.builders.chat_prompt_builder.ChatPromptBuilder
    init_parameters:
      template: |
        {% message role="system" %}
        You are an excellent labeling tool.
        You receive a chat history.
        If the last user question is asking for information from a database, output "QUESTION".
        This is also the case when something needs to be explained.

        If the last user question contains instructions for working with a given passage, output "PASSAGE".
        If the chat history is empty, the label "PASSAGE" cannot be used.

        Example questions referring to the passage are:
        “write a summary of this”, “in bullet points”, or “rephrase as an email”.

        For the "QUESTION" label, you additionally output a "query".
        The "query" must be in natural language, since it will be processed by a hybrid of keyword and semantic search. Therefore, be as explicit as possible with the keywords.
        If both X and Y are asked about, only include Y in the "query" and do not repeat X.
        Abbreviations such as “ArbVG”, “BVwG”, or “Abs”, as well as all paragraphs, must remain unchanged and must not be reformulated in the "query".
        If the question does not need context from the chat history, output it unchanged in the "query".

        For the "PASSAGE" label, you output NOTHING additional.
        Simply: "PASSAGE".
        {% endmessage %}

        {% for message in chat_history %}
        {% message role=message.role %}
        {{ message.text }}
        {% endmessage %}
        {% endfor %}

        {% message role="assistant" %}
        { "label": "
        {% endmessage %}
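
To actually send the resulting prompt to a ChatGenerator, connect the builder's prompt output to the generator's messages input in the pipeline's connections section. This is a minimal sketch in which OpenAIChatGenerator stands in for whatever chat generator component your pipeline defines:

connections:
  # OpenAIChatGenerator is a placeholder for the chat generator component in your pipeline
  - sender: ChatPromptBuilder.prompt
    receiver: OpenAIChatGenerator.messages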