OllamaTextEmbedder

Computes the embedding of a string of text and returns it together with metadata collected during the embedding process.

Basic Information

  • Type: haystack_integrations.components.embedders.ollama.text_embedder.OllamaTextEmbedder

Inputs

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| text | str | | Text to be converted to an embedding. |
| generation_kwargs | Optional[Dict[str, Any]] | None | Optional arguments to pass to the Ollama generation endpoint, such as temperature or top_p. See the Ollama docs. |

Outputs

| Parameter | Type | Description |
| --- | --- | --- |
| embedding | List[float] | The computed embedding of the input text. |
| meta | Dict[str, Any] | Metadata collected during the embedding process. |

Overview

Work in Progress

Bear with us while we're working on adding pipeline examples and the most common component connections.

Computes the embedding of a string of text using embedding models compatible with the Ollama Library. To embed a list of Documents instead, use OllamaDocumentEmbedder.

Usage example:

from haystack_integrations.components.embedders.ollama import OllamaTextEmbedder

# Requires a running Ollama instance (default URL: http://localhost:11434)
# with the default embedding model (nomic-embed-text) already pulled.
embedder = OllamaTextEmbedder()
result = embedder.run(text="What do llamas say once you have thanked them? No probllama!")
print(result['embedding'])
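
In a query pipeline, the embedding output is typically connected to an embedding retriever's query_embedding input. The following is a minimal sketch assuming Haystack's InMemoryDocumentStore and InMemoryEmbeddingRetriever, with documents that were previously embedded using the same Ollama model:

from haystack import Pipeline
from haystack.components.retrievers.in_memory import InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack_integrations.components.embedders.ollama import OllamaTextEmbedder

# Assumption: this store already holds documents embedded with the same Ollama
# embedding model (for example via OllamaDocumentEmbedder at indexing time).
document_store = InMemoryDocumentStore()

query_pipeline = Pipeline()
query_pipeline.add_component("text_embedder", OllamaTextEmbedder())
query_pipeline.add_component("retriever", InMemoryEmbeddingRetriever(document_store=document_store))
query_pipeline.connect("text_embedder.embedding", "retriever.query_embedding")

result = query_pipeline.run({"text_embedder": {"text": "What do llamas eat?"}})
print(result["retriever"]["documents"])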

Usage Example

components:
  OllamaTextEmbedder:
    type: ollama.src.haystack_integrations.components.embedders.ollama.text_embedder.OllamaTextEmbedder
    init_parameters:
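
As an illustration, init_parameters can spell out any of the init parameters documented in the next section; the values below simply repeat the documented defaults:

components:
  OllamaTextEmbedder:
    type: ollama.src.haystack_integrations.components.embedders.ollama.text_embedder.OllamaTextEmbedder
    init_parameters:
      model: nomic-embed-text
      url: http://localhost:11434
      timeout: 120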

Parameters

Init Parameters

These are the parameters you can configure in Pipeline Builder:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| model | str | nomic-embed-text | The name of the model to use. The model must be available in the running Ollama instance. |
| url | str | http://localhost:11434 | The URL of a running Ollama instance. |
| generation_kwargs | Optional[Dict[str, Any]] | None | Optional arguments to pass to the Ollama generation endpoint, such as temperature or top_p. See the available arguments in the Ollama docs. |
| timeout | int | 120 | The number of seconds to wait before the request to the Ollama API times out. |
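
For reference, here is a sketch of constructing the component in Python with the same parameters; the model name, URL, and generation options are placeholders you would adapt to your own Ollama setup:

from haystack_integrations.components.embedders.ollama import OllamaTextEmbedder

# Example values only; point these at your own Ollama instance and model.
embedder = OllamaTextEmbedder(
    model="nomic-embed-text",
    url="http://localhost:11434",
    generation_kwargs={"temperature": 0.0},  # forwarded to the Ollama endpoint
    timeout=120,
)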

Run Method Parameters

These are the parameters you can configure for the component's run() method. This means you can pass these parameters at query time through the API, in Playground, or when running a job. For details, see Modify Pipeline Parameters at Query Time.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| text | str | | Text to be converted to an embedding. |
| generation_kwargs | Optional[Dict[str, Any]] | None | Optional arguments to pass to the Ollama generation endpoint, such as temperature or top_p. See the Ollama docs. |
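
A minimal sketch of passing generation_kwargs at query time; the temperature and top_p values are examples of options forwarded to the Ollama endpoint:

from haystack_integrations.components.embedders.ollama import OllamaTextEmbedder

embedder = OllamaTextEmbedder()

# generation_kwargs passed to run() are forwarded to the Ollama endpoint for
# this call; temperature and top_p are example options from the table above.
result = embedder.run(
    text="No probllama!",
    generation_kwargs={"temperature": 0.0, "top_p": 0.9},
)
print(len(result["embedding"]))  # length of the embedding vector
print(result["meta"])            # metadata collected during embedding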