MistralTextEmbedder
Embeds a query string using Mistral AI models and returns a vector for use in semantic search.
Embedding Models in Query Pipelines and Indexes
The embedding model you use to embed documents in your indexing pipeline must be the same model you use to embed the query in your query pipeline. For example, if you use CohereDocumentEmbedder to embed your documents, use CohereTextEmbedder with the same model to embed your queries.
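Embeddings from the same model live in the same vector space, which is what makes comparing a query embedding against document embeddings meaningful. A minimal sketch of how a retriever scores documents against a query embedding, using made-up 3-dimensional vectors (real mistral-embed vectors have 1024 dimensions):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors; higher means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings standing in for MistralTextEmbedder / MistralDocumentEmbedder output.
query_embedding = [0.1, 0.9, 0.2]
doc_embeddings = {
    "doc_about_cats": [0.12, 0.88, 0.25],
    "doc_about_cars": [0.9, 0.1, 0.3],
}

# Rank documents by similarity to the query, most similar first.
ranked = sorted(
    doc_embeddings.items(),
    key=lambda item: cosine_similarity(query_embedding, item[1]),
    reverse=True,
)
print(ranked[0][0])
```

Embedding retrievers in vector stores such as OpenSearch perform this kind of comparison at scale; the sketch only illustrates why query and document vectors must come from the same model.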
Key Features
- Embeds query strings using Mistral AI embedding models.
- Returns the query as a vector for use with embedding retrievers.
- Configurable prefix and suffix text for custom formatting.
- Outputs usage metadata alongside the embedding.
- Lightweight component for query pipeline integration.
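The prefix and suffix settings simply wrap the query text before it is sent to the embedding model, which some workflows use for instruction-style formatting. A sketch of that behavior (the function name is illustrative, not the component's API):

```python
def apply_affixes(text: str, prefix: str = "", suffix: str = "") -> str:
    # The embedder concatenates prefix + text + suffix before embedding.
    return f"{prefix}{text}{suffix}"

print(apply_affixes("What is semantic search?", prefix="query: "))
```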
Configuration
Create a secret with your Mistral API key. Use MISTRAL_API_KEY as the secret key. For instructions, see Create Secrets. Get your API key from Mistral AI.
- Drag the MistralTextEmbedder component onto the canvas from the Component Library.
- Click the component to open the configuration panel.
- On the General tab, enter the name of the Mistral embedding model to use, such as mistral-embed.
- Go to the Advanced tab to configure the API key, API base URL, and prefix and suffix text.
Connections
MistralTextEmbedder accepts a text string as input and outputs an embedding (list of floats) and a meta dictionary with model name and usage statistics.
Connect the Input component to its text input. Connect its embedding output to an embedding retriever to find semantically similar documents.
For embedding documents in indexes, use MistralDocumentEmbedder instead. Make sure to use the same model in both components.
Usage Example
This example shows a query pipeline that embeds a user query using Mistral and retrieves relevant documents.
```yaml
components:
  MistralTextEmbedder:
    type: haystack_integrations.components.embedders.mistral.text_embedder.MistralTextEmbedder
    init_parameters:
      api_key:
        type: env_var
        env_vars:
          - MISTRAL_API_KEY
        strict: false
      model: mistral-embed

  EmbeddingRetriever:
    type: haystack_integrations.components.retrievers.opensearch.embedding_retriever.OpenSearchEmbeddingRetriever
    init_parameters:
      document_store:
        type: haystack_integrations.document_stores.opensearch.document_store.OpenSearchDocumentStore
        init_parameters:
          hosts:
          index: 'mistral-embeddings'
          max_chunk_bytes: 104857600
          embedding_dim: 1024
          return_embedding: false
          method:
          mappings:
          settings:
          create_index: true
          http_auth:
          use_ssl:
          verify_certs:
          timeout:
      top_k: 10

connections:
  - sender: MistralTextEmbedder.embedding
    receiver: EmbeddingRetriever.query_embedding

max_runs_per_component: 100

metadata: {}

inputs:
  query:
    - MistralTextEmbedder.text
    - EmbeddingRetriever.query

outputs:
  documents: EmbeddingRetriever.documents
```
Parameters
Inputs
| Parameter | Type | Default | Description |
|---|---|---|---|
| text | str | | The string to embed. |
Outputs
| Parameter | Type | Default | Description |
|---|---|---|---|
| embedding | List[float] | | The embedding of the input text. |
| meta | Dict[str, Any] | | Information about the usage of the model, including model name and token usage. |
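A hypothetical shape of the component's output, with field names following the table above; the exact contents of meta depend on the Mistral API response, so the keys shown here are an assumption:

```python
# Illustrative output dictionary; embedding is truncated (mistral-embed returns 1024 floats),
# and the meta fields are an assumed example of model name plus token usage.
result = {
    "embedding": [0.017, -0.023, 0.051],
    "meta": {
        "model": "mistral-embed",
        "usage": {"prompt_tokens": 9, "total_tokens": 9},
    },
}

vector = result["embedding"]
print(result["meta"]["model"], len(vector))
```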
Init Parameters
These are the parameters you can configure in Pipeline Builder:
| Parameter | Type | Default | Description |
|---|---|---|---|
| api_key | Secret | Secret.from_env_var('MISTRAL_API_KEY') | The Mistral API key. |
| model | str | mistral-embed | The name of the Mistral embedding model to be used. |
| api_base_url | Optional[str] | https://api.mistral.ai/v1 | The Mistral API base URL. For more details, see the Mistral docs. |
| prefix | str | "" | A string to add to the beginning of each text. |
| suffix | str | "" | A string to add to the end of each text. |
| timeout | Optional[float] | None | Timeout for Mistral client calls. If not set, it defaults to either the OPENAI_TIMEOUT environment variable, or 30 seconds. |
| max_retries | Optional[int] | None | Maximum number of retries to contact Mistral after an internal error. If not set, it defaults to either the OPENAI_MAX_RETRIES environment variable, or 5. |
| http_client_kwargs | Optional[Dict[str, Any]] | None | A dictionary of keyword arguments to configure a custom httpx.Client or httpx.AsyncClient. For more information, see the HTTPX documentation. |
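The fallback chain for timeout and max_retries described in the table can be sketched as follows; the helper names are illustrative, not the component's internals:

```python
import os
from typing import Optional

def resolve_timeout(timeout: Optional[float] = None) -> float:
    # An explicit value wins; otherwise fall back to OPENAI_TIMEOUT, then 30 seconds.
    if timeout is not None:
        return timeout
    return float(os.environ.get("OPENAI_TIMEOUT", 30.0))

def resolve_max_retries(max_retries: Optional[int] = None) -> int:
    # An explicit value wins; otherwise fall back to OPENAI_MAX_RETRIES, then 5.
    if max_retries is not None:
        return max_retries
    return int(os.environ.get("OPENAI_MAX_RETRIES", 5))

print(resolve_timeout(12.5), resolve_max_retries(2))
```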
Run Method Parameters
These are the parameters you can configure for the component's run() method. This means you can pass these parameters at query time through the API, in Playground, or when running a job. For details, see Modify Pipeline Parameters at Query Time.
| Parameter | Type | Default | Description |
|---|---|---|---|
| text | str | | The string to embed. |