Connect to Model and Service Providers
Connect deepset Cloud to model or service providers, such as Hugging Face, OpenAI, Cohere, or Unstructured. This is recommended if you want to use a model or a data processing service from one of these providers.
About This Task
Each connection creates an environment variable with a predefined name for storing the API key. Once deepset Cloud is connected to a model or service provider, you no longer need to pass api_key in your pipelines when using a model or data processing service from that provider.
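For example, after you connect OpenAI, a generator in your pipeline can read the key from the environment instead of having it written into the pipeline definition. The sketch below is a minimal illustration, assuming the connection exposes the key as the OPENAI_API_KEY environment variable and that your pipeline uses Haystack's OpenAIGenerator; the model name is only a placeholder.

```python
from haystack.components.generators import OpenAIGenerator
from haystack.utils import Secret

# The key is resolved from the environment variable created by the
# connection, so no literal api_key value appears in the pipeline.
generator = OpenAIGenerator(
    model="gpt-4o-mini",  # placeholder model name
    api_key=Secret.from_env_var("OPENAI_API_KEY"),  # assumed variable name
)
```

The same pattern applies to the other providers: each connection stores the key in its own predefined environment variable, and the matching components read it from there.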
deepset Cloud currently supports connections to the following model providers:
- Amazon Bedrock and SageMaker
- Azure OpenAI
- Cohere
- Google AI
- Hugging Face (only needed for private models hosted there)
- Nvidia
- OpenAI
- SearchAPI
- Voyage AI
Connections to the following data processing services and data sources are also supported:
- Azure Document Intelligence
- DeepL
- Snowflake
- unstructured.io
Prerequisites
- You must have an active account with the provider you want to connect.
- Have your user access token or credentials ready, as you'll use them to connect the provider to deepset Cloud. For more information, see:
- User access tokens in Hugging Face
- Secret API keys in OpenAI
- Production keys in Cohere
- Model access in Amazon Bedrock
- Getting access to Azure OpenAI
- Getting an API key for Gemini API
- Nvidia personal keys
- Using Document Intelligence models
- DeepL API
- Snowflake website
- unstructured.io documentation
- Google Search API
- Voyage AI
Connect to a Provider
- Click your initials in the top right corner and select Connections.
- Click Connect next to the provider.
- Enter your user access token and submit it.
For detailed instructions per provider, see also Using Hosted Models and External Services.