Get started

Opaque services let you automate secure, low-latency data processing using built-in functions that run inside trusted execution environments (TEEs). Each service enforces strong privacy guarantees: both the input and computation remain confidential, even during processing.

This tutorial walks you through getting started with two built-in services:

  • Data ingestion securely fetches external data from sources like REST APIs.
  • Data redaction detects and masks PII within sensitive inputs.

You'll learn how to launch a service, configure your environment, and invoke it programmatically.

Before you begin

You’ll need access to an analytics and ML workspace and a local environment with Python 3.10+ and the Opaque Python SDK installed. If you haven’t set that up yet, start with the local setup guide.

You’ll also need the REST URL of your Opaque deployment so the SDK can reach it. This is the API domain configured during deployment through Azure Marketplace. If you’re not sure where to find it, ask your workspace admin.

Step 1. Configure and launch services

  1. Go to Workspaces, select an analytics and ML workspace from the list, and open the Services tab. This tab lists all services created in the selected workspace. If none exist yet, the page will be blank.

    The Services tab for the selected workspace.

  2. Click New Service.

  3. On the New Service page:

    • Enter a name (max 50 characters).
    • Choose a Service Type (e.g., redaction or data ingestion).
    • (For redaction only) Add PII types to redact and/or custom fields to sanitize using regular expressions.

    The New Service details page

    Enter a service name and choose a redaction type.

    Available redaction service PII types

    For redaction services only, choose PII fields or add custom fields to sanitize.

    Note on custom redaction fields

    If you're adapting this workflow to your own redaction use cases, keep the following in mind:

    • The redaction service supports built-in PII types (e.g., email, phone number), but you can also define custom sanitization fields using regular expressions.
    • Custom regex is evaluated using Python’s re module. Patterns from other languages may behave differently in Python and can fail silently if incompatible.
    • To ensure compatibility, test your expressions using a Python-compatible regex tester such as regex101.com (with the Python flavor selected), or refer to the Python re module documentation for syntax guidance. A quick way to check a pattern locally is sketched after this note.
    • All custom fields are evaluated before built-in fields. If multiple custom fields are defined, they are applied in the order listed—fields higher in the list take priority.
    • Built-in fields are evaluated using an internal confidence-scoring mechanism. The field type with the highest confidence match is prioritized during redaction.
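
    If you want to sanity-check a custom pattern before adding it to a service, a minimal local test with Python’s re module looks like the sketch below. The phone-number pattern and sample text are illustrative only and not part of any service configuration:

    import re

    # Hypothetical custom field: a simple US-style phone number pattern.
    # Substitute whatever pattern matches your own data.
    phone_pattern = r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"

    sample = "Call me at 555-867-5309 or email jane@example.com."

    # re.compile raises re.error if the pattern isn't valid Python regex.
    # Patterns that compile but rely on another language's semantics can still
    # mismatch silently, so also inspect findall's output on sample data.
    try:
        compiled = re.compile(phone_pattern)
    except re.error as exc:
        raise SystemExit(f"Pattern is not valid Python regex: {exc}")

    print(compiled.findall(sample))  # ['555-867-5309']
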
  4. Click Launch to deploy the service.

  5. Check the status of the service in the Services list view.

Step 2. Prepare to run your service

With your service deployed, you’re ready to run secure workflows using the Opaque SDK.

Each workflow uses environment variables to authenticate with the Opaque platform and invoke services inside your workspace. Depending on your use case, you may choose to ingest external data, redact PII, or compose both steps into a policy-enforced pipeline.

  1. Collect required information.

    Before you proceed, gather the following required values; you’ll need these for each service you want to call:

    • Your API key: Click API keys in the left-hand nav.
    • Your workflow service UUID: Located at the top of your service’s details page (Workspaces > Services > Service name).

      Find the workflow service’s UUID on the service’s details page.

    • The REST URL of your Opaque deployment: This is the API domain used when deploying Opaque via Azure Marketplace. Ask your workspace admin or deployment contact if you don’t have it.

  2. Configure your local environment.

    Switch to your terminal and set the following environment variables:

    export OPAQUE_API_KEY="<your_api_key>"    # From the API Keys section in Opaque
    export OPAQUE_REST_URL="<your_rest_url>"  # Must include the API version number
    

    This sets up your local environment to securely invoke services through the SDK.
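
    Optionally, you can confirm the variables are visible to Python before calling the SDK. This is only a local sanity check and uses the same variable names exported above:

    import os

    # Fail fast if the exported variables are not visible to this process.
    for var in ("OPAQUE_API_KEY", "OPAQUE_REST_URL"):
        if not os.environ.get(var):
            raise SystemExit(f"Missing environment variable: {var}")

    print("Environment looks good.")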

  3. Submit a request.

    Once your environment is configured, you can submit a request using a Python script.

    The following Python script shows how to invoke two services—data ingestion and PII redaction—in a secure, programmatic flow using the Opaque SDK.

    This example:

    • Retrieves data from a public API.
    • Sends it to the ingestion service.
    • Pipes the result into a redaction service.

    You can adapt this script to use your own endpoints and redaction logic.

    import os
    import uuid
    from opaque.ingestion import HttpIngestorRequest, HttpMethod, IngestorService
    from opaque.redaction import RedactionService
    
    # The SDK reads OPAQUE_API_KEY and OPAQUE_REST_URL from the environment variables you exported earlier.
    # Replace these placeholders with the UUIDs from each service's details page.
    INGESTION_SERVICE_UUID = "your-ingestion-service-uuid"
    REDACTION_SERVICE_UUID = "your-redaction-service-uuid"
    
    # External source to ingest (e.g., public API or file gateway)
    INGESTOR_URL = "https://dummyjson.com/users"
    
    # Initialize service clients
    ingestor_client = IngestorService(service_uuid=uuid.UUID(INGESTION_SERVICE_UUID))
    redaction_client = RedactionService(service_uuid=uuid.UUID(REDACTION_SERVICE_UUID))
    
    # Step 1: Ingest external data
    ingestion_request = HttpIngestorRequest(
        url=INGESTOR_URL,
        headers={},
        body="",
        method=HttpMethod.GET,
    )
    ingestion_response = ingestor_client.ingest(inputs=[ingestion_request])
    print("Raw output:", ingestion_response.texts)
    
    # Step 2: Redact PII from response
    redact_response = redaction_client.redact(input_texts=ingestion_response.texts)
    print("Redacted output:", redact_response)
    

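    If the external source you are ingesting from requires authentication, the same request shape used in the script above can carry credentials in its headers. The sketch below assumes a bearer token stored in a hypothetical SOURCE_API_TOKEN environment variable and an example endpoint; adjust both to whatever your API expects:

    import os
    from opaque.ingestion import HttpIngestorRequest, HttpMethod

    # Hypothetical: keep the source API token in the environment, not in the script.
    source_token = os.environ["SOURCE_API_TOKEN"]

    authenticated_request = HttpIngestorRequest(
        url="https://api.example.com/records",  # replace with your own endpoint
        headers={"Authorization": f"Bearer {source_token}"},
        body="",
        method=HttpMethod.GET,
    )

    # Pass it to the ingestion client exactly as in the script above:
    # ingestion_response = ingestor_client.ingest(inputs=[authenticated_request])
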
With your environment set up and your service launched, you're ready to integrate secure, low-latency data processing into your AI workflows.