Feature List
Haystack Enterprise Platform is an end-to-end platform for building, deploying, and managing AI applications in production. From RAG systems to AI agents, it gives you full control over every layer of the stack while handling the infrastructure complexity for you.
AI Agents
Build autonomous, multi-step AI systems that reason, plan, and act. The Haystack Platform Agent uses an LLM to decide which tools to call and loops until it reaches a result, handling tasks far beyond simple question answering. Key agent capabilities:
- Model-agnostic: Use any LLM as the Agent's brain and easily swap them out.
- Rich tool ecosystem: Give your Agent access to pipelines, custom Python functions, and MCP servers as tools.
- Controllable: Define exit conditions so the Agent stops exactly when and how you want.
- Context-aware: Built-in conversation memory tracks history and tool usage across turns.
- Real-time streaming: Stream Agent responses as they're generated.
- Full observability: Trace every Agent decision with Langfuse or Weights & Biases Weave.
Start with ready-made Agent templates and customize from there. For details, see AI Agents and Building AI Agents.
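The decide-call-loop behavior described above can be sketched in plain Python. This is an illustrative sketch of the general agent-loop pattern, not the platform's actual API: `pick_action` stands in for a real LLM call, and the tool names and exit condition are hypothetical.

```python
# Sketch of the agent loop: the model picks an action, the loop runs the
# chosen tool, and execution stops when the exit condition ("finish") fires.

def run_agent(question, tools, pick_action, max_turns=5):
    """Loop: ask the model for an action, run the tool, stop on 'finish'."""
    history = [("user", question)]
    for _ in range(max_turns):
        action, arg = pick_action(history)       # LLM decides the next step
        if action == "finish":                   # exit condition reached
            return arg
        result = tools[action](arg)              # call the chosen tool
        history.append(("tool", f"{action} -> {result}"))
    return "max turns reached"

# Stub "LLM" policy: call the calculator once, then finish with its result.
def demo_policy(history):
    if any(role == "tool" for role, _ in history):
        return "finish", history[-1][1].split("-> ")[1]
    return "calculator", "6*7"

# eval() is for the demo tool only; never eval untrusted input.
tools = {"calculator": lambda expr: str(eval(expr))}
print(run_agent("What is 6*7?", tools, demo_policy))  # prints "42"
```

A real platform agent adds conversation memory, streaming, and tracing around this same core loop.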
Visual Pipeline Builder
Design AI applications visually with Pipeline Builder, an intuitive drag-and-drop editor.
- Drag and drop from a library of 180+ pre-built components to assemble your pipeline.
- Switch between visual editing and YAML with real-time synchronization.
- Build complex flows with branching, loops, and conditional routing.
- Export to YAML or Python for local development and version control.
- Jump-start with curated, tested pipeline templates covering RAG, agents, data processing, and more.
To learn more, see Create a Pipeline in Studio. To understand pipelines, see Pipelines.
Model Flexibility
Avoid vendor lock-in. Haystack Enterprise Platform is model-agnostic, so you can swap LLMs, embedding models, and ranking models without changing your pipeline architecture.
Supported providers include OpenAI, Anthropic, Azure OpenAI, Amazon Bedrock, Google Gemini, Cohere, Mistral, NVIDIA, Meta Llama, Together AI, and more. For the full list, see Supported Connections and Integrations.
Multimodal AI
Build systems that go beyond text. Process, understand, and generate content across multiple data types, including audio, images, and documents:
- Audio: Transcribe speech to text with Whisper-based components and build search over audio content.
- Images: Use multimodal LLMs (GPT-4o, Claude, Gemini, and others) to analyze and reason over images.
- Documents: Extract structured data from PDFs, Office files, and scanned documents using Azure Document Intelligence, Unstructured.io, or Docling.
For more details, see Multimodal Systems.
MCP Support
Connect your AI applications to external systems through the Model Context Protocol (MCP). Use the official deepset MCP server to expose Haystack Platform pipelines as tools for any MCP-compatible agent, or connect your agents to third-party MCP servers.
For details, see Agent Tools and Model Context Protocol.
Data Ingestion and Processing
Bring in data from any source and prepare it for your AI applications:
- File processing: Handle PDFs, Office documents, HTML, images, and audio with out-of-the-box converters and preprocessors. For details, see Preparing Your Data.
- External services: Process files with Azure Document Intelligence, Unstructured.io, or DeepL for translation.
- Metadata: Attach, filter, and use metadata throughout your pipelines.
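The metadata bullet above can be made concrete with a small sketch. This is a hypothetical illustration of metadata filtering in plain Python, not the platform's filter syntax; the field names (`type`, `year`) are invented for the example.

```python
# Sketch: documents carry a metadata dict, and queries filter on its fields.

def filter_by_meta(documents, **conditions):
    """Keep documents whose metadata matches every given condition."""
    return [
        d for d in documents
        if all(d["meta"].get(k) == v for k, v in conditions.items())
    ]

docs = [
    {"content": "Q3 report", "meta": {"type": "report", "year": 2024}},
    {"content": "Q3 memo",   "meta": {"type": "memo",   "year": 2024}},
]

print([d["content"] for d in filter_by_meta(docs, type="report")])  # ['Q3 report']
```

In practice the same idea applies at every stage: attach metadata during ingestion, then filter on it at retrieval time to narrow the search space.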
Flexible Document Stores
Choose where your data lives. Haystack Enterprise Platform supports multiple document stores so you can use the vector database that fits your stack:
- Managed: OpenSearch (fully managed by Haystack Platform).
- Bring your own: Elasticsearch, MongoDB Atlas, Pinecone, Qdrant, Weaviate, PGVector.
- Databases: Connect to Snowflake for structured data queries.
You can also bring your own S3 bucket or OpenSearch cluster through VPC integration. For an overview, see Document Stores.
Custom Components
Extend the platform with your own logic. Write custom Python components and use them alongside built-in ones in your pipelines.
- Create reusable, versioned components shared across your organization.
- Use the inline Code component for quick, single-pipeline customizations.
- Securely manage API keys with secrets—no credentials in code.
For details, see Add Custom Code.
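The custom-component pattern is essentially a class with typed inputs and a `run()` method that returns named outputs. The sketch below illustrates that shape in standalone Python (the real platform wraps such classes with Haystack's component machinery; `KeywordTagger` and its fields are hypothetical examples, not built-ins).

```python
# Minimal standalone sketch of the custom-component pattern: a class whose
# run() method takes typed inputs and returns a dict of named outputs.

class KeywordTagger:
    """Hypothetical component: flags documents that contain a keyword."""

    def __init__(self, keyword: str):
        self.keyword = keyword.lower()

    def run(self, documents: list) -> dict:
        tagged = []
        for doc in documents:
            meta = dict(doc.get("meta", {}))
            meta["has_keyword"] = self.keyword in doc["content"].lower()
            tagged.append({**doc, "meta": meta})
        # Components return a dict of named outputs that downstream
        # components can consume.
        return {"documents": tagged}

tagger = KeywordTagger("invoice")
out = tagger.run([{"content": "Invoice #42", "meta": {}}])
print(out["documents"][0]["meta"]["has_keyword"])  # prints True
```

Because outputs are named, a pipeline can route `documents` from this component straight into the input of the next one.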
Prompt Engineering
Iterate on prompts faster with built-in tools:
- Test and refine prompts in Prompt Explorer with live pipeline responses.
- Compare prompt performance side by side across up to three pipelines.
- Save prompts to a library and update them across pipelines without redeployment.
For details, see Engineering Prompts.
Production-Grade Infrastructure
Deploy with confidence. Haystack Enterprise Platform handles scaling, reliability, and performance so you can focus on your application.
- One-click deployment: Move from prototype to production in seconds.
- Autoscaling: Pipelines scale automatically with traffic (up to ten replicas for production workloads). You can configure replicas and standby behavior in the pipeline's Settings tab.
- Service levels: Mark pipelines as Development or Production to control standby behavior and resource allocation. For details, see Pipeline Service Levels.
- CI/CD: Automate pipeline deployment with GitHub Actions.
Observability and Tracing
Understand exactly what happens inside your AI applications:
- Pipeline logs: View component-level logs with timestamps, messages, and log levels.
- Live debugger: Inspect component behavior in real time from Pipeline Builder.
- Langfuse integration: Trace full query journeys with spans, latency breakdowns, and dependency maps. For details, see Trace with Langfuse.
- Weights & Biases Weave: Monitor ML telemetry, token counts, and performance metrics. For details, see Use Weights & Biases.
- Remote debugging: Connect to running pipelines through a VS Code tunnel.
For more information on observability, see Trace Your Pipelines.
User Feedback
Test your AI applications with real users and improve continuously:
- Shareable prototypes: Generate branded, shareable links to your pipeline—no login required for testers. Customize with your logo and colors. For more information, see Share a Pipeline Prototype.
- Structured feedback: Collect ratings, tags, and comments from users. Group, filter, and export feedback for analysis. For details, see Collect User Feedback.
- Playground: Test pipeline responses interactively before sharing. For instructions, see Testing Your Pipeline.
Batch Processing with Jobs
Run queries at scale using Jobs:
- Process query sets in bulk across your entire dataset.
- Run queries once on all files or repeat per individual file.
- Share formatted results with anyone—no login required.
Enterprise Security and Access Control
Keep your data and applications secure with enterprise-grade access management:
- Role-based access control: Assign preset or custom roles with granular permissions at the organization and workspace levels. For details, see User Roles and Permissions.
- Single sign-on (SSO): Authenticate through your identity provider. For details, see Enable SSO.
- Secrets management: Store API keys and credentials securely, scoped to workspaces or organizations. To learn about secrets, see Secrets and Integrations.
- VPC integration: Run with your own OpenSearch cluster and S3 bucket for full data isolation. For details, see Connect Your Own File Storage.
- Workspace isolation: Organize teams and projects into isolated workspaces (up to 100 per organization).
REST API and SDK
Integrate Haystack Enterprise Platform into your applications and workflows:
- REST API: Full programmatic access to pipelines, files, indexes, jobs, and feedback. For full reference, see API.
- Python SDK: Upload files, manage pipelines, and automate workflows with Python or the CLI. To learn more, see Working with the SDK.
- MCP server: Expose Haystack Platform pipelines as tools for any MCP-compatible agent.
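As a sketch of programmatic access, the standard library is enough to build a pipeline query request. The base URL, path, and request body below are assumptions modeled on typical deepset-style REST APIs, not a verified reference; check the API documentation for the exact endpoints.

```python
# Hypothetical sketch: construct (but do not send) a POST request that
# runs a query against a deployed pipeline over the REST API.
import json
import urllib.request

API_BASE = "https://api.cloud.deepset.ai/api/v1"   # assumed base URL

def build_search_request(workspace, pipeline, query, api_key):
    """Build a POST request for a pipeline search; endpoint shape is assumed."""
    url = f"{API_BASE}/workspaces/{workspace}/pipelines/{pipeline}/search"
    body = json.dumps({"queries": [query]}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_search_request("my-workspace", "rag-qa", "What is RAG?", "API_KEY")
print(req.full_url)
# To actually send it: urllib.request.urlopen(req)
```

The Python SDK and CLI wrap the same REST endpoints, so anything shown here can also be automated through them.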
For licensing information about third-party software used in Haystack Enterprise Platform, see Third Party Software.