OpenAPIServiceConnector
Dynamically invoke OpenAPI service methods using information from a ChatMessage. OpenAPIServiceConnector acts as an interface between deepset and OpenAPI services.
Basic Information
- Type: haystack_integrations.connectors.openapi_service.OpenAPIServiceConnector
- Components it can connect with:
  - ChatGenerators: It receives messages from a ChatGenerator.
  - OutputAdapter: It sends the service response to an OutputAdapter that converts it to a format subsequent components can use.
Inputs
| Parameter | Type | Default | Description |
|---|---|---|---|
| messages | List[ChatMessage] | | A list of ChatMessage objects containing the messages to be processed. The last message is expected to contain the parameter invocation payload. |
| service_openapi_spec | Dict[str, Any] | | The OpenAPI JSON specification object of the service to be invoked. All $ref values must be resolved. |
| service_credentials | Optional[Union[dict, str]] | None | The credentials to be used for authentication with the service. Currently, only the http and apiKey OpenAPI security schemes are supported. Use http for Basic, Bearer, or other HTTP authentication schemes. Use apiKey for API keys and cookie authentication. |
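To make the two supported credential shapes concrete, here is a small standard-library sketch. The helper `build_auth_header` is hypothetical (not part of the connector's API); it only illustrates how a bare string maps to an HTTP Bearer scheme while a dict can describe an apiKey scheme:

```python
# Hypothetical helper illustrating the two credential shapes described
# above: a plain string for HTTP Bearer auth, or a dict for apiKey-style
# schemes. Not part of Haystack's or deepset's API.

def build_auth_header(credentials):
    """Turn credentials into an HTTP header dict (illustrative only)."""
    if isinstance(credentials, str):
        # "http" scheme with Bearer: a bare token string
        return {"Authorization": f"Bearer {credentials}"}
    if isinstance(credentials, dict):
        # "apiKey" scheme: header name and key supplied explicitly
        return {credentials["name"]: credentials["value"]}
    raise TypeError("credentials must be a str or dict")

print(build_auth_header("sk-secret-token"))
print(build_auth_header({"name": "X-API-Key", "value": "abc123"}))
```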
Outputs
| Parameter | Type | Description |
|---|---|---|
| service_response | Dict[str, Any] | A dictionary with a service_response key containing a list of ChatMessage objects, each holding a response from the service. The response is in JSON format, and the content attribute of each ChatMessage contains the JSON string. |
Overview
OpenAPIServiceConnector connects deepset pipelines to OpenAPI services, making it possible to call operations as defined in the OpenAPI specification of the service.
It works with the ChatMessage data class, using the message payload to identify the method to call and the parameters to pass. The payload must be a JSON-formatted function call string that includes the method name and parameters. The connector then invokes the method on the OpenAPI service and returns the response as a ChatMessage.
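To make the expected payload concrete, here is a minimal standard-library sketch of the kind of OpenAI-style function-call string the last ChatMessage might carry, and how a method name and parameters can be extracted from it. The payload fields follow the OpenAI function-calling format; the operation name and arguments below are made-up examples:

```python
import json

# An OpenAI-style function-call payload, as a ChatMessage's content
# might carry it (illustrative; exact shape depends on the model).
payload = json.dumps({
    "name": "search_issues",
    "arguments": json.dumps({"query": "open bugs", "per_page": 5}),
})

# The connector needs the method to call and the parameters to pass.
call = json.loads(payload)
method_name = call["name"]
parameters = json.loads(call["arguments"])  # arguments arrive as a JSON string

print(method_name)  # search_issues
print(parameters)
```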
Before using this component, you typically resolve service endpoint parameters with the OpenAPIServiceToFunctions component.
Keep in mind that OpenAPIServiceConnector is usually not used on its own. It's designed to be part of a pipeline with the OpenAPIServiceToFunctions component and an OpenAIChatGenerator component that supports function calling. The payload is usually generated by the OpenAIChatGenerator.
Usage Example
Here's a complete YAML pipeline that demonstrates how to use OpenAPIServiceConnector with other components to create a function calling pipeline that:
- Converts OpenAPI specifications to function schemas using OpenAPIServiceToFunctions
- Uses an OpenAIChatGenerator to generate function calls
- Connects to the OpenAPI service using OpenAPIServiceConnector
- Processes the response through output adapters
- Generates the final answer using another OpenAIChatGenerator
```yaml
components:
  spec_to_functions:
    type: haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions
    init_parameters: {}
  functions_llm:
    type: haystack.components.generators.chat.openai.OpenAIChatGenerator
    init_parameters:
      api_key:
        type: env_var
        env_vars:
          - OPENAI_API_KEY
        strict: false
      model: gpt-3.5-turbo-0613
  openapi_container:
    type: haystack.components.connectors.openapi_service.OpenAPIServiceConnector
    init_parameters: {}
  a1:
    type: haystack.components.converters.output_adapter.OutputAdapter
    init_parameters:
      template: |
        {
          "tools": [{
            "type": "function",
            "function": {{ functions[0] }}
          }],
          "tool_choice": {
            "type": "function",
            "function": {"name": {{ functions[0].name }}}
          }
        }
      output_type: Dict[str, Any]
      custom_filters: {}
  a2:
    type: haystack.components.converters.output_adapter.OutputAdapter
    init_parameters:
      template: "{{specs[0]}}"
      output_type: Dict[str, Any]
  a3:
    type: haystack.components.converters.output_adapter.OutputAdapter
    init_parameters:
      template: "{{system_message + service_response}}"
      output_type: typing.List[haystack.dataclasses.ChatMessage]
  llm:
    type: haystack.components.generators.chat.openai.OpenAIChatGenerator
    init_parameters:
      api_key:
        type: env_var
        env_vars:
          - OPENAI_API_KEY
        strict: false
      model: gpt-4-1106-preview
connections:
  - sender: spec_to_functions.functions
    receiver: a1.functions
  - sender: spec_to_functions.openapi_specs
    receiver: a2.specs
  - sender: functions_llm.replies
    receiver: openapi_container.messages
  - sender: openapi_container.service_response
    receiver: a3.service_response
  - sender: a1.output
    receiver: functions_llm.generation_kwargs
  - sender: a2.output
    receiver: openapi_container.service_openapi_spec
  - sender: a3.output
    receiver: llm.messages
inputs:
  query: # These components will receive the query as input
    - "functions_llm.messages"
    - "spec_to_functions.sources"
  filters: # These components will receive filters as input
    - "openapi_container.service_credentials"
    - "a3.system_message"
outputs: # Defines the output of your pipeline
  answers: "llm.replies" # The output of the pipeline is the generated answer
  service_response: "openapi_container.service_response" # The output from the OpenAPI service
max_runs_per_component: 100
metadata: {}
```
The pipeline fetches the OpenAPI specification and system prompt from the provided URLs and processes the user query about Sam Altman's ouster from OpenAI.
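To see what the a1 adapter's Jinja template produces, here is a plain-Python equivalent that builds the same tools/tool_choice payload from a sample function schema (the schema itself is a made-up example):

```python
# Plain-Python equivalent of the `a1` OutputAdapter template above:
# it wraps the first function schema in the `tools` / `tool_choice`
# structure that OpenAI-style function calling expects.
# The schema below is a made-up example.
function_schema = {
    "name": "search_issues",
    "description": "Search issues in a tracker",
    "parameters": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
    },
}

generation_kwargs = {
    "tools": [{"type": "function", "function": function_schema}],
    "tool_choice": {
        "type": "function",
        "function": {"name": function_schema["name"]},
    },
}

print(generation_kwargs["tool_choice"]["function"]["name"])  # search_issues
```

Forcing tool_choice to the single available function ensures the model emits a function call rather than a free-text reply.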
Parameters
Init Parameters
These are the parameters you can configure in Pipeline Builder:
| Parameter | Type | Default | Description |
|---|---|---|---|
| ssl_verify | Optional[Union[bool, str]] | None | Whether to use SSL verification for the requests. If a string is passed, it is used as the CA certificate. |
Run Method Parameters
These are the parameters you can configure for the component's run() method. This means you can pass these parameters at query time through the API, in Playground, or when running a job. For details, see Modify Pipeline Parameters at Query Time.
| Parameter | Type | Default | Description |
|---|---|---|---|
| messages | List[ChatMessage] | | A list of ChatMessage objects containing the messages to be processed. The last message should contain the tool calls. |
| service_openapi_spec | Dict[str, Any] | | The OpenAPI JSON specification object of the service to be invoked. All the $refs should already be resolved. |
| service_credentials | Optional[Union[dict, str]] | None | The credentials to be used for authentication with the service. Currently, only the http and apiKey OpenAPI security schemes are supported. |
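Since the connector expects a spec with all references already resolved, a quick standard-library check like the following can catch leftover $ref entries before calling run(). The helper is illustrative, not part of the component:

```python
def has_unresolved_refs(node):
    """Recursively check an OpenAPI spec fragment for leftover $ref keys
    (illustrative helper, not part of the connector)."""
    if isinstance(node, dict):
        return "$ref" in node or any(has_unresolved_refs(v) for v in node.values())
    if isinstance(node, list):
        return any(has_unresolved_refs(item) for item in node)
    return False

resolved_spec = {"paths": {"/search": {"get": {"parameters": []}}}}
unresolved_spec = {
    "paths": {"/search": {"get": {
        "responses": {"200": {"schema": {"$ref": "#/definitions/Result"}}},
    }}},
}

print(has_unresolved_refs(resolved_spec))    # False
print(has_unresolved_refs(unresolved_spec))  # True
```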