Use Hugging Face Models

Use models hosted on Hugging Face in your pipelines.

You can use any model from Hugging Face in your pipelines. First, connect deepset AI Platform to Hugging Face, and then add a component that uses the model you want.

Prerequisites

You need a Hugging Face API token. You enter it when connecting deepset AI Platform to Hugging Face.

Use Models from Hugging Face

Connect deepset AI Platform to Hugging Face through the Integrations page. You can do so for a single workspace or for the whole organization:

Add Workspace-Level Integration

  1. Click your profile icon and choose Settings.
  2. Go to Workspace>Integrations.
  3. Find Hugging Face and click Connect next to it.
  4. Enter the API key and any other required details.
  5. Click Connect. You can use this integration in pipelines and indexes in the current workspace.

Add Organization-Level Integration

  1. Click your profile icon and choose Settings.
  2. Go to Organization>Integrations.
  3. Find Hugging Face and click Connect next to it.
  4. Enter the API key and any other required details.
  5. Click Connect. You can use this integration in pipelines and indexes in all workspaces in the current organization.

For details, see Add Integrations.

Then, add a component that uses the model you have in mind. Which component you need depends on the model type, as shown in the sketch below.
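For orientation, here is a minimal sketch of how model types map to component types in Haystack, the framework deepset AI Platform pipelines are built on: an embedding model is used through an embedder, a text-generation model (LLM) through a generator, and a cross-encoder through a ranker. The model names are placeholders, and on the platform you typically configure the equivalent components in your pipeline rather than in Python.

```python
# A minimal sketch, assuming Haystack 2.x; all model names are placeholders.
from haystack.components.embedders import HuggingFaceAPITextEmbedder
from haystack.components.generators import HuggingFaceAPIGenerator
from haystack.components.rankers import TransformersSimilarityRanker

# An embedding model is used through an embedder component.
text_embedder = HuggingFaceAPITextEmbedder(
    api_type="serverless_inference_api",
    api_params={"model": "BAAI/bge-small-en-v1.5"},  # placeholder model
)

# A text-generation model (LLM) is used through a generator component.
generator = HuggingFaceAPIGenerator(
    api_type="serverless_inference_api",
    api_params={"model": "mistralai/Mistral-7B-Instruct-v0.3"},  # placeholder model
)

# A cross-encoder ranking model is used through a ranker component
# (this one downloads and runs the model locally).
ranker = TransformersSimilarityRanker(model="cross-encoder/ms-marco-MiniLM-L-6-v2")
```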

ℹ️

Embedding Models in Query Pipelines and Indexes

The embedding model you use to embed documents in your index must be the same as the embedding model you use to embed the query in your pipeline.

This means the embedders for your indexes and pipelines must match. For example, if you use CohereDocumentEmbedder to embed your documents, you should use CohereTextEmbedder with the same model to embed your queries.
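The same rule applies to Hugging Face embedders. Below is a minimal sketch, assuming Haystack 2.x and a placeholder model name: the document embedder used in the index and the text embedder used in the query pipeline point at the same model, so query and document vectors end up in the same embedding space.

```python
# A minimal sketch, assuming Haystack 2.x; the model name is a placeholder.
from haystack.components.embedders import (
    HuggingFaceAPIDocumentEmbedder,
    HuggingFaceAPITextEmbedder,
)

MODEL = "intfloat/e5-base-v2"  # placeholder; use the same model in both places

# In the index: embeds documents before they are written to the document store.
document_embedder = HuggingFaceAPIDocumentEmbedder(
    api_type="serverless_inference_api",
    api_params={"model": MODEL},
)

# In the query pipeline: embeds the query with the same model.
text_embedder = HuggingFaceAPITextEmbedder(
    api_type="serverless_inference_api",
    api_params={"model": MODEL},
)
```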


Usage Examples

Check the documentation of the component you want to use for usage examples.
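As a starting point, here is a hedged sketch of calling a Hugging Face text-generation model through Haystack's HuggingFaceAPIGenerator. The model name is a placeholder, and the sketch assumes the token is available in the HF_API_TOKEN environment variable when running locally; on deepset AI Platform, the integration you set up above provides the credentials.

```python
# A minimal sketch, assuming Haystack 2.x, a placeholder model,
# and an HF_API_TOKEN environment variable set locally.
from haystack.components.generators import HuggingFaceAPIGenerator

generator = HuggingFaceAPIGenerator(
    api_type="serverless_inference_api",
    api_params={"model": "mistralai/Mistral-7B-Instruct-v0.3"},  # placeholder model
)

# run() returns a dictionary with the generated replies.
result = generator.run(prompt="Summarize what an embedding model does in one sentence.")
print(result["replies"][0])
```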