Working with Large Language Models (LLMs)
LLMs show remarkable capabilities in understanding and generating human-like text. Have a look at how you can use them in your deepset Cloud pipelines.
LLMs in Your Pipelines
You can easily integrate LLMs into your deepset Cloud pipelines using the versatile PromptNode. PromptNode works well for generative question answering and other NLP tasks, such as text classification and summarization. It performs the task you define in the prompt you provide.
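For example, here's a minimal sketch of what using PromptNode on its own might look like in Haystack, which deepset Cloud builds on (the model name and prompt are illustrative):

```python
from haystack.nodes import PromptNode

# A minimal sketch, assuming the Haystack 1.x API that deepset Cloud builds on.
# The free FLAN-T5 model below is used purely for illustration.
prompt_node = PromptNode(model_name_or_path="google/flan-t5-base")

# PromptNode performs whatever task the prompt describes -- here, summarization.
summary = prompt_node(
    "Summarize this text in one sentence: deepset Cloud lets you build, "
    "evaluate, and deploy NLP pipelines without managing infrastructure."
)
print(summary)
```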
Creating an LLM App
You can start with an out-of-the-box template and then experiment with models and prompts:
- Create a pipeline from a template and choose the Generative Question Answering Template with FLAN-T5. It uses PromptNode with the free FLAN-T5 model by Google. By changing the prompt and the model, you can adapt it to NLP tasks beyond generative question answering.
- Experiment with your pipeline settings (see the sketch after this list for an example):
  - Try changing the model. PromptNode works with various models, including GPT-4 and models by Cohere or Anthropic. For a full list, see PromptNode.
  - Experiment with different prompts:
    - PromptNode comes with a set of ready-made prompts for you to try out. Simply pass the prompt name in the `default_prompt_template` parameter. For a full list of prompts, see Prompt Templates.
    - Refine and test your prompts in Prompt Explorer, a sandbox environment in deepset Cloud.
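As a rough sketch, switching the model and picking a ready-made prompt by name might look like this in Haystack (the model name, API key placeholder, and prompt template name are illustrative assumptions):

```python
from haystack.nodes import PromptNode
from haystack.schema import Document

# A minimal sketch, assuming the Haystack 1.x API that deepset Cloud builds on.
# The model name, API key placeholder, and prompt template name are illustrative.
prompt_node = PromptNode(
    model_name_or_path="gpt-4",                            # swap the free FLAN-T5 model for a hosted one
    api_key="YOUR_OPENAI_API_KEY",                         # hosted providers require an API key
    default_prompt_template="deepset/question-answering",  # ready-made prompt, referenced by name
)

# The ready-made question-answering prompt expects documents and a query.
answers = prompt_node(
    query="What is the capital of Germany?",
    documents=[Document(content="Berlin is the capital of Germany.")],
)
print(answers)
```

In deepset Cloud, you make the same changes by editing the PromptNode parameters in your pipeline's YAML definition.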
Learn More
Here's a collection of resources to explore if you want to learn more about LLMs and generative AI in deepset Cloud.
About LLMs
Generative AI in Practice
- Use Case: Generative AI Systems
- Tutorial: Building a Summarization System with a Large Language Model
- PromptNode
Prompt Engineering