Feature List
You can use deepset Cloud to design your LLM app by combining different components into pipelines using an intuitive interface. Here's an overview of the features deepset Cloud offers.
Data Processing
- Preprocess your data using out-of-the-box components that can handle different file types. For more information, see Converters and PreProcessing Data with Pipeline Nodes.
- Process your files using the Azure AI Document Intelligence service. For details, see Use Azure Document Intelligence.
Data Sources
Use your own OpenSearch cluster, S3 bucket, or Snowflake database to store the data your deepset Cloud pipelines run on. For more information, see Setting Up Your VPC and Use Snowflake Database.
Pipeline Templates
Create an app in no time with ready-made templates that cover numerous use cases. Our templates are tested and curated, and they work out of the box. For a list of available templates with explanations, see Pipeline Templates.
Prompt Engineering
- Test and edit your prompts in Prompt Studio, or use prompt templates curated by deepset.
- Save your prompts to use later or update them in your pipelines directly from Prompt Studio.
- Compare prompts across up to three pipelines.
For details, see Engineering Prompts.
Model Agnostic
Easily swap LLMs thanks to deepset Cloud's model agnostic approach.
- Supported models:
- Anthropic's Claude models
- OpenAI GPT models, including GPT-3.5, GPT-4, and GPT-4o
- Cohere's command and generation models
- Llama 2 and Llama 3, including Llama 3.1
- Hugging Face transformers (all text2text-generation models)
- Models you can run remotely:
- OpenAI models available on Azure (for a full list of models, see Azure documentation)
- Open source models hosted on Amazon SageMaker
- Text-generation models hosted on Amazon Bedrock (for a full list, see Amazon Bedrock documentation)
To use an LLM in your pipeline, add PromptNode and pass the model name in its model_name_or_path parameter. For details, see PromptNode models.
For instructions on using hosted LLMs, see Using Hosted LLMs in Your Pipelines.
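For illustration, here is a minimal sketch of a pipeline definition with a PromptNode. The component names, model name, and prompt template shown here are placeholder assumptions, not a complete working pipeline; see the pipeline documentation for the full schema.

```yaml
# Hypothetical fragment of a deepset Cloud pipeline definition.
# Names and templates are illustrative placeholders.
components:
  - name: prompt_node
    type: PromptNode
    params:
      model_name_or_path: gpt-4o        # swap in any supported model here
      default_prompt_template: question-answering

pipelines:
  - name: query
    nodes:
      - name: prompt_node
        inputs: [Query]
```

Because the model is selected by a single parameter, switching to another supported LLM typically means changing only the model_name_or_path value.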
Batch Question Answering
Use the Jobs functionality to seamlessly gather consistent information across your datasets. With Jobs, you can:
- Process queries in bulk.
- Run your query set once on all your files or repeat the queries per individual file.
- Easily format the results and share them with anyone without the need to log in.
Shareable Pipeline Prototypes
Show your prototypes to others, let them test your app, and collect their feedback. Customize prototypes with your brand colors and logo, and share them with anyone; your testers don't need to set up accounts or log in.
Dependable Infrastructure
- Deploy your pipeline and let deepset Cloud take care of all the scaling.
- Indicate which pipelines are in production and which are in development to ensure they get the reliability and resources they need. See also Pipeline Service Levels.
Monitoring
Monitor the groundedness of your RAG pipelines and analyze the referenced documents. For details, see Check the Groundedness Score.
REST API
Interact with deepset Cloud using a powerful REST API. Plug deepset Cloud pipelines into your interface to use them in your apps.
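As an illustration, here is a minimal Python sketch that builds a search request against a deepset Cloud pipeline. The workspace and pipeline names are placeholders, and the exact endpoint path and payload shape are assumptions to verify against the current API reference.

```python
import json
from urllib import request

# Base URL of the deepset Cloud REST API (verify against the API reference).
API_BASE = "https://api.cloud.deepset.ai/api/v1"


def build_search_request(workspace: str, pipeline: str,
                         query: str, api_key: str) -> request.Request:
    """Build a POST request that runs `query` against a deployed pipeline.

    The /workspaces/{ws}/pipelines/{name}/search path and the
    {"queries": [...]} payload follow the deepset Cloud API docs;
    double-check both before relying on them.
    """
    url = f"{API_BASE}/workspaces/{workspace}/pipelines/{pipeline}/search"
    payload = json.dumps({"queries": [query]}).encode("utf-8")
    return request.Request(
        url,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",  # API key from deepset Cloud
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Placeholder workspace, pipeline, and key for illustration only.
req = build_search_request("my-workspace", "my-rag-pipeline",
                           "What is deepset Cloud?", "YOUR_API_KEY")
# To actually send it: response = request.urlopen(req)
```

Authenticating with a bearer token and posting JSON queries like this makes it straightforward to call a deployed pipeline from any backend or frontend that can issue HTTP requests.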