Using Hosted Models and External Services

deepset AI Platform integrates with multiple model and service providers. Learn how to use them in your pipelines.

📘

Need a custom connection?

To use a third-party provider or service that’s not listed in deepset Connections, you can create a custom component to connect to it. For details, see Custom Components.
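A custom component is essentially a regular Haystack component that wraps the provider's API. The snippet below is a minimal sketch, assuming a Haystack 2.x-style component; the endpoint URL, request payload, and the "text" response field are hypothetical placeholders, not a real provider API.

```python
# Minimal sketch of a custom component wrapping a provider that isn't listed
# in deepset Connections. The endpoint URL, payload, and response field
# ("text") are hypothetical placeholders.
from typing import List

import requests
from haystack import component


@component
class ExampleProviderGenerator:
    def __init__(self, api_key: str, base_url: str = "https://api.example-provider.com/v1/generate"):
        self.api_key = api_key
        self.base_url = base_url

    @component.output_types(replies=List[str])
    def run(self, prompt: str):
        # Call the external service and map its response to a standard Haystack output.
        response = requests.post(
            self.base_url,
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"prompt": prompt},
            timeout=30,
        )
        response.raise_for_status()
        return {"replies": [response.json().get("text", "")]}
```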

Accessing Connections

All deepset connections are listed on the Connections page, which you can access by clicking your initials and choosing Connections:

[Image: The Connections menu expanded]

Supported Providers

Currently, on the Connections page you can set up connections with the following model and service providers:

  • Amazon Bedrock and SageMaker
  • Azure OpenAI
  • Cohere
  • Google AI
  • Hugging Face (only needed for private models hosted there)
  • MongoDB
  • NVIDIA
  • OpenAI
  • SearchAPI
  • Together AI
  • Voyage AI
  • Weights & Biases

Supported connections to data processing services and data sources are:

  • Azure AI Document Intelligence
  • DeepL
  • Snowflake
  • Unstructured

Adding Connections to Your Pipeline

  1. Add the provider's API key on the Connections page.
  2. Add to your pipeline a component that supports the integration.
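For example, after adding an OpenAI API key on the Connections page, a query pipeline can use an OpenAI-based generator. The snippet below is a minimal Haystack-style sketch of step 2; it assumes the stored key is exposed to the pipeline as the OPENAI_API_KEY environment variable.

```python
# Sketch of step 2: a pipeline component that picks up the OpenAI connection.
# Assumes the key added on the Connections page is available to the pipeline
# as the OPENAI_API_KEY environment variable.
from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.utils import Secret

pipeline = Pipeline()
pipeline.add_component("prompt_builder", PromptBuilder(template="Answer briefly: {{ question }}"))
pipeline.add_component("generator", OpenAIGenerator(api_key=Secret.from_env_var("OPENAI_API_KEY")))
pipeline.connect("prompt_builder.prompt", "generator.prompt")

result = pipeline.run({"prompt_builder": {"question": "What is retrieval-augmented generation?"}})
print(result["generator"]["replies"][0])
```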

Find the integration you want to use below and follow the detailed steps.

Integrations

There are also other model providers and frameworks you can use in your pipelines, even though they're not listed on the Connections page. To use these integrations, add a secret on the Secrets page and then pass the secret name as the API key required by the provider. For details, see Add Secrets to Connect to Third Party Providers.
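For example, with a secret named ANTHROPIC_API_KEY added on the Secrets page, an Anthropic generator could be configured as below. This is a sketch, assuming the anthropic-haystack integration package is available and that the secret is exposed as an environment variable of the same name.

```python
# Sketch of configuring an integration through a secret rather than a connection.
# Assumes a secret named ANTHROPIC_API_KEY was added on the Secrets page and the
# anthropic-haystack integration package is installed.
from haystack.utils import Secret
from haystack_integrations.components.generators.anthropic import AnthropicGenerator

generator = AnthropicGenerator(api_key=Secret.from_env_var("ANTHROPIC_API_KEY"))
result = generator.run(prompt="Summarize what a retrieval pipeline does.")
print(result["replies"][0])
```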

Currently, deepset AI Platform supports the following integrations: