Using Hosted Models and External Services

The deepset AI Platform integrates with multiple model and service providers. Learn how to use these integrations in your pipelines.


Need a custom integration?

To use a third-party provider or service that’s not listed in deepset Integrations, you can create a custom component to connect to it. For details, see Create a Custom Component.

Accessing Integrations

All deepset integrations are listed on the Integrations page in Settings. To access settings, click your profile icon and choose Settings:

  • To see workspace-level integrations, go to Workspace > Integrations.
  • To see organization-level integrations, go to Organization > Integrations.

For more information about integrations and their scope, see Secrets and Integrations and Add Integrations.

Supported Providers

On the Integrations page, you can currently set up connections with the following model providers:

  • Amazon Bedrock and SageMaker
  • Azure OpenAI
  • Cohere
  • Google AI
  • Hugging Face (only needed for private models hosted there)
  • MongoDB
  • NVIDIA
  • OpenAI
  • SearchAPI
  • Together AI
  • Voyage AI
  • Weights & Biases

Supported connections to data processing services and data sources are:

  • Azure AI Document Intelligence
  • DeepL
  • Snowflake
  • Unstructured

Adding Integrations to Your Pipeline

  1. Add the provider's API key on the Integrations page.
  2. Add a component that supports the integration to your pipeline.

Find the integration you want to use and follow its detailed setup steps.
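As an illustration of step 2, once an OpenAI integration is added, a component that uses it might look like this in a pipeline YAML. This is a minimal sketch following Haystack 2.x serialization conventions; the component name and model are illustrative, and the exact `api_key` structure may differ in your pipeline editor:

```yaml
components:
  generator:
    type: haystack.components.generators.openai.OpenAIGenerator
    init_parameters:
      model: gpt-4o-mini
      # Resolved from the OpenAI integration you added in Settings
      api_key:
        type: env_var
        env_vars:
          - OPENAI_API_KEY
        strict: false
```

With the integration in place, the platform supplies the API key at runtime, so you don't paste credentials into the pipeline definition itself.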

Integrations with Secrets

You can also use model providers and frameworks that aren't listed on the Integrations page. To connect to one of these, add a secret on the Secrets page, then pass the secret's name as the API key the provider requires. For details, see Add Secrets.
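For example, if you added a secret named `MY_PROVIDER_API_KEY`, a component could reference it like this. This is a hypothetical sketch: the component type and secret name are placeholders, and the `api_key` structure assumes Haystack 2.x secret serialization:

```yaml
components:
  custom_generator:
    # Placeholder type; substitute the real component for your provider
    type: my_package.MyProviderGenerator
    init_parameters:
      # Resolved from the secret you added on the Secrets page
      api_key:
        type: env_var
        env_vars:
          - MY_PROVIDER_API_KEY
        strict: true
```

The secret name acts as an environment variable that the platform resolves at runtime, so the key itself never appears in the pipeline YAML.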