Technical overview

The Opaque Confidential AI platform enables secure, scalable analytics on sensitive data by leveraging hardware-based confidential computing. The platform is structured into three distinct components, each deployed independently across two cloud environments:

  • Client provides the user-facing interface and APIs for interacting with the platform.
  • Control plane manages platform-wide operations, including coordination, state management, and access control.
  • Data plane executes analytics and AI workloads securely within trusted execution environments (TEEs).

The Opaque Confidential AI platform

Opaque client

The Opaque client serves as the entry point to the Opaque platform, offering two interfaces: a web application and a REST API.

The web application provides an intuitive graphical interface for managing sensitive data and computational tasks, or jobs. It interacts seamlessly with the REST API to enable secure operations. Users can choose to work through the browser-based interface or integrate the REST API into their own applications to develop custom workflows.

The REST API allows programmatic access to the platform, enabling users and applications to leverage its core capabilities, including:

  • Managing workspaces with cryptographic integrity guarantees.
  • Integrating encrypted data with configurable policies and securely sharing it with workspaces.
  • Proposing, approving, and submitting jobs for execution within trusted hardware environments.
  • Accessing cryptographically signed logs for verifiable user actions.
  • Retrieving or exporting encrypted job results.

For a complete list of available REST API features, see the API reference documentation.
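As a sketch of what programmatic access might look like, the snippet below builds an authenticated job-submission request. The base URL, endpoint path, field names, and bearer-token scheme are illustrative assumptions, not the platform's documented API; consult the API reference for the real schema.

```python
import json
import urllib.request

# Hypothetical base URL; replace with your deployment's API endpoint.
BASE_URL = "https://opaque.example.com/api/v1"

def build_job_request(token, workspace_id, job_spec):
    """Build an authenticated job-submission request (constructed, not sent)."""
    body = json.dumps({"workspaceId": workspace_id, "spec": job_spec}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/jobs",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_job_request("demo-token", "ws-123", {"task": "aggregate"})
print(req.get_method(), req.full_url)
```

The same pattern extends to the other capabilities listed above (workspace management, data sharing, log retrieval) by varying the path and payload.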

The client also handles critical security tasks, including encryption and decryption of data, key management, and integrity verification. These operations are managed by the data encryption service (DES), which:

  • Encrypts data before making it available on the platform.
  • Decrypts job results for retrieval or export.
  • Uses platform-managed cryptographic keys (with support for customer-managed keys planned).
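The steps above follow the familiar envelope-encryption pattern: encrypt data under a data key, then wrap that key under a platform-managed key. The sketch below is a minimal illustration of that pattern, not the DES implementation; Fernet (from the third-party `cryptography` package) stands in for the platform's actual cipher and key service.

```python
from cryptography.fernet import Fernet

# Key-encryption key (KEK): platform-managed, held by the key service.
kek = Fernet(Fernet.generate_key())

# Data-encryption key (DEK): generated per dataset on the client side.
dek_bytes = Fernet.generate_key()
dek = Fernet(dek_bytes)

# Encrypt the data with the DEK, then wrap the DEK under the KEK; only
# ciphertext and the wrapped key need to leave the trusted environment.
ciphertext = dek.encrypt(b"sensitive records")
wrapped_dek = kek.encrypt(dek_bytes)

# Retrieving results reverses the steps: unwrap the DEK, then decrypt.
plaintext = Fernet(kek.decrypt(wrapped_dek)).decrypt(ciphertext)
```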

By enabling secure collaboration on sensitive data, the client ensures all user actions—such as approvals, data sharing, and job submissions—are authentic and tamper-proof. Because it interacts with plaintext data and performs security-sensitive operations, the client must be deployed in a trusted environment.

Control plane

The Opaque control plane is the central state store and configuration hub for the Opaque platform. It manages the data plane's configuration and shares policies with it. Users interact with the control plane through the REST API, which integrates the following core services:

  • Management hub: Oversees workspaces, users, data sharing, jobs (authored, approved, rejected, submitted), and associated policies. It maintains metadata for jobs and datasets, such as data IDs and workspace associations, enforces job validity and integrity, and handles user-created digital signatures (although verification occurs in the data plane).
  • Key management service: Handles all data encryption keys securely using HashiCorp Vault. It supports platform-managed keys (with customer-managed key support coming soon) and operates within a confidential VM, ensuring keys remain inaccessible to Opaque and are only accessible via SSO-gated access tokens.
  • Audit logger: Tracks all user actions on the platform, such as workspace changes and job submissions. Each record is cryptographically signed with a secret key and stored securely. Users can retrieve and independently verify the audit trail using external signature verification tools.
  • Notification service: Delivers actionable updates to the GUI for workspace-related events, such as job reviews. It integrates with the management hub to provide users with timely information about relevant actions.

Together, these services ensure a secure and efficient control plane for managing the Opaque platform.
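The audit logger's sign-and-verify flow can be sketched with an HMAC, as one concrete instance of signing records with a secret key. The record fields and HMAC-SHA256 choice here are assumptions for illustration; the platform's actual schema and algorithm may differ.

```python
import hashlib
import hmac
import json

# Hypothetical signing key, held by the audit logger.
SIGNING_KEY = b"audit-logger-secret"

def sign_record(record):
    """Sign a canonical JSON serialization of the audit record."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()

def verify_record(record, signature):
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_record(record), signature)

entry = {"actor": "alice", "action": "job.submit", "workspace": "ws-123"}
sig = sign_record(entry)
assert verify_record(entry, sig)

entry["action"] = "job.approve"   # any tampering invalidates the signature
assert not verify_record(entry, sig)
```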

Data plane

The Opaque data plane is designed to securely execute workloads approved by workspace members, ensuring confidentiality and integrity in a distributed cluster environment. All jobs run within trusted execution environments (TEEs), guaranteeing data privacy, code integrity, and secure communication via attested TLS (aTLS). To maintain trust, the data plane performs integrity checks to ensure only approved and unaltered jobs are executed.
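The "approved and unaltered" check above amounts to comparing a digest of the submitted job against the digest recorded at approval time. The sketch below shows that comparison; the canonicalization and job-spec fields are assumptions, not the platform's actual format.

```python
import hashlib
import json

def job_digest(job_spec):
    """Canonical SHA-256 digest of a job specification."""
    canonical = json.dumps(job_spec, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

approved = {"image": "analytics:1.4", "task": "count_rows"}
approved_digest = job_digest(approved)      # recorded when members approve

# An identical submission matches and may run; any change is rejected.
assert job_digest({"image": "analytics:1.4", "task": "count_rows"}) == approved_digest
assert job_digest({"image": "analytics:1.4", "task": "export_rows"}) != approved_digest
```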

Its key components include the following:

  • Workload manager: Orchestrates the execution of jobs submitted by the control plane, communicates with the control plane via a message broker to identify jobs, and initiates workflows within the data plane to manage job execution.
  • Workflow controller: Manages workflows initiated by the workload manager, launches custom resources tailored to the specific workload requirements, and uses Argo Workflows for orchestration, enabling integration with diverse workloads.
  • Workload controller: A Kubernetes operator designed to support a variety of workloads, such as Spark, Ray, or Kubeflow. It monitors and reconciles workload resources launched by the workflow controller, ensuring flexibility to accommodate any workload supported by Kubernetes operators.

This architecture ensures robust, secure, and flexible execution of confidential workloads in the data plane.
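The handoff from workload manager to workflow controller can be sketched as a small dispatch step: only approved job messages are turned into workflow specs. The message and workflow shapes below are hypothetical simplifications; the real broker schema and Argo Workflows resources are considerably richer.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class JobMessage:
    job_id: str
    status: str        # e.g. "approved" or "rejected"
    workload: str      # e.g. "spark", "ray", "kubeflow"

def to_workflow(msg: JobMessage) -> Optional[dict]:
    """Map an approved job message to a workflow spec; drop everything else."""
    if msg.status != "approved":
        return None    # the data plane only executes approved jobs
    return {
        "name": f"job-{msg.job_id}",
        "workloadType": msg.workload,   # selects the matching operator
    }

assert to_workflow(JobMessage("42", "approved", "spark")) == {
    "name": "job-42",
    "workloadType": "spark",
}
assert to_workflow(JobMessage("43", "rejected", "spark")) is None
```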