Use integrations

Integrations in Opaque are preconfigured, reusable components that can be shared across workspaces and workflows. They let teams securely connect to external systems (like LLMs or data sources) without having to re-enter credentials or config details every time.

Think of integrations as ready-to-use building blocks: define them once, and make them available for anyone in your workspace to use in workflows.

Supported integration types

Opaque currently supports two types of integrations:

  • Data connectors (retrievers): Connect to external data sources so agents can retrieve structured or unstructured content at runtime.

    Example: An Azure AI Search Retriever integration that queries an enterprise search index.

  • LLMs (language models): Connect to large language models for text processing, analysis, or generation.

    Examples: An OpenAI Service integration or a vLLM Service integration.

Create a new integration

To create a new integration:

  1. Go to the Integrations page in Opaque and click New Integration.

    Go to the Integrations page to create a new integration.

  2. Choose the integration type—either a Data connector or an LLM—and click Next.

    • Data connector: Create when you want agents to pull information from an external data source (currently Azure AI Search).
    • LLM: Create when you want agents to use a large language model (OpenAI or vLLM).

    Choose an integration type.

  3. Select the agent for this integration.

    • Data connectors: The available option is currently limited to Azure AI Search Retriever.
    • LLMs: You can configure either an OpenAI Service or a vLLM Service agent.

    For data connectors, your option is currently limited to Azure AI Search Retriever.

    When choosing an LLM configuration, you can choose between an OpenAI Service or a vLLM Service agent.

  4. Configure the agent. Give it a clear name and description so teammates can quickly recognize its purpose. Each integration type includes fields specific to its role.

    Name integrations clearly

    Choose names and descriptions that make sense to others in your organization. For example, use “Customer Search Index” instead of “Azure Test 1.” Clear naming helps teammates recognize and reuse the right integration without confusion.

    When configuring an Azure AI Search Retriever, you can supply the following fields:

    • API key: Usually required to connect securely.
    • API version: Optional; defaults to the latest supported version.
    • Search index name: Typically required to query your data.
    • Search service name: Optional; identifies the Azure Search service hosting your index.

    An example configuration of an Azure AI Search Retriever.
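    To see how these fields fit together, here is a minimal sketch (in Python, with hypothetical placeholder values) of the request that an Azure AI Search query is built from. The variable names mirror the integration form fields; an actual call would send the payload with an HTTP client:

    ```python
    # Sketch of how the Azure AI Search Retriever fields map onto a search
    # request. All values below are hypothetical placeholders.
    search_service_name = "customer-search"   # Search service name
    search_index_name = "support-articles"    # Search index name
    api_version = "2023-11-01"                # API version (a default is used if omitted)
    api_key = "<your-azure-search-api-key>"   # API key

    # Azure AI Search exposes queries at a service- and index-scoped endpoint.
    endpoint = (
        f"https://{search_service_name}.search.windows.net"
        f"/indexes/{search_index_name}/docs/search?api-version={api_version}"
    )

    headers = {"Content-Type": "application/json", "api-key": api_key}
    payload = {"search": "reset password", "top": 5}  # example query

    print(endpoint)
    # A real call would POST `payload` to `endpoint` with `headers`,
    # e.g. requests.post(endpoint, json=payload, headers=headers).
    ```

    The integration stores these values once, so workflows that reuse it never need to handle the API key directly.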

    When configuring an LLM agent, you’ll want to specify:

    • Model name
    • Temperature
    • Maximum number of tokens
    • API key/URL (depending on the service)
    • Context prompt

    Any fields you pre-specify here will be locked and uneditable for downstream users.
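    As an illustration, these LLM fields correspond roughly to the parameters of an OpenAI-style chat completion request. The sketch below (Python, with hypothetical values) shows how a model name, temperature, token limit, and context prompt might be assembled into a request payload:

    ```python
    # Sketch of how the LLM integration fields might map onto an
    # OpenAI-style chat completion request. Values are hypothetical.
    model_name = "gpt-4o"        # Model name
    temperature = 0.2            # Temperature (lower = more deterministic)
    max_tokens = 512             # Maximum number of tokens in the response
    context_prompt = "You are a helpful support assistant."  # Context prompt

    def build_request(user_message: str) -> dict:
        """Assemble a chat-completion payload; the context prompt becomes
        the system message, and the user's text follows it."""
        return {
            "model": model_name,
            "temperature": temperature,
            "max_tokens": max_tokens,
            "messages": [
                {"role": "system", "content": context_prompt},
                {"role": "user", "content": user_message},
            ],
        }

    payload = build_request("How do I reset my password?")
    # The API key (and, for vLLM, the service URL) would be supplied to the
    # HTTP client or SDK rather than placed in the payload itself.
    ```

    Fields locked by the integration creator behave like the fixed values above: downstream users supply only the parts left open, such as the user message.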

    An example configuration of an OpenAI agent.

  5. Click Save. The new integration appears on the Integrations page.

    The Integrations page lists all preconfigured agents that are available to your organization.

Share an integration with workspaces

Once created, an integration can be shared with one or more workspaces so it’s available in their workflows. Follow these steps:

  1. On the Integrations page, select the integration from the list and click Share with workspaces. You’ll need to be the integration’s creator or a workspace admin to share it.

    Select the integration and click Share with workspaces.

  2. Choose one or more workspaces and click Share integration.

    Select the workspace(s) and click Share integration.

You’ll see a confirmation notification that your new integration has been shared with the selected workspace(s).

A confirmation notification indicates when an integration has been shared successfully.

To confirm, go to the target workspace and open the Integrations tab. You should see the shared integration listed.

Shared integrations are listed on the workspace's Integrations tab.

Click the more icon to open a drawer that shows the agent’s configuration details.

Once shared, integrations appear in the Nodes panel of your workflow, grouped by type (data connectors or LLMs). From there, they behave just like agents you configure yourself — you can connect them in workflows, define prompts, and add guardrails as needed.

Shared integrations appear in the Nodes panel under their type: data connectors or LLMs.

Note

Any fields set by the integration creator are locked and cannot be edited in the workflow.