Migration Guide

Switch from OpenAI, Azure OpenAI, or Anthropic in 2 lines. ACAI is fully OpenAI-compatible — change the base URL and API key, keep everything else.

Why Migrate?

  • Compliance built in — PII redaction, audit logs, and compliance reports ship with every request. No extra middleware.
  • 37 models, one API key — Access OpenAI, Anthropic, Meta, Mistral, Cohere, and more through a single endpoint.
  • Zero code rewrite — OpenAI SDK compatible. Your existing code works as-is with a base-URL change.
  • Bring Your Own Key — Keep your existing provider keys and add ACAI's compliance layer on top.

From OpenAI

Change base_url and api_key. Nothing else changes.

Before (OpenAI direct)

from openai import OpenAI

client = OpenAI(api_key="sk-...")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize this claim"}],
)

After (ACAI)

from openai import OpenAI

client = OpenAI(
    base_url="https://api.agilecloud.ai/v1",  # ← changed
    api_key="dai-...",                        # ← changed
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # same model name
    messages=[{"role": "user", "content": "Summarize this claim"}],
)

From Azure OpenAI

Drop the AzureOpenAI client and deployment-name routing. Use standard model names.

Before (Azure OpenAI)

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",
    api_key="...",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="my-gpt4-deployment",  # deployment name, not model
    messages=[{"role": "user", "content": "Hello"}],
)

After (ACAI)

from openai import OpenAI

client = OpenAI(
    base_url="https://api.agilecloud.ai/v1",
    api_key="dai-...",
)

response = client.chat.completions.create(
    model="gpt-4o",  # standard model name
    messages=[{"role": "user", "content": "Hello"}],
)

From Anthropic

Switch to the OpenAI SDK format. ACAI serves Claude models through the same OpenAI-compatible API.

Before (Anthropic SDK)

import anthropic

client = anthropic.Anthropic(api_key="sk-ant-...")

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)

After (ACAI)

from openai import OpenAI

client = OpenAI(
    base_url="https://api.agilecloud.ai/v1",
    api_key="dai-...",
)

response = client.chat.completions.create(
    model="claude-sonnet-4",  # Anthropic via ACAI
    messages=[{"role": "user", "content": "Hello"}],
)

JavaScript / TypeScript

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.agilecloud.ai/v1",
  apiKey: "dai-...",
});

const response = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello" }],
});

Keep Your Own Keys (BYOK)

Already have provider API keys? Register them in the Backends dashboard and ACAI proxies through them with full compliance applied. You pay your provider directly — ACAI charges only the compliance fee.

Environment Variable Cheat Sheet

# .env
OPENAI_API_KEY=dai-...
OPENAI_BASE_URL=https://api.agilecloud.ai/v1

Many frameworks (LangChain, LlamaIndex, Vercel AI SDK) respect these env vars automatically — no code changes required.
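Before wiring up a framework, you can sanity-check that the variables resolve the way an OpenAI-compatible tool would read them. This is a stdlib-only sketch of that lookup (the `dai-...` key is a placeholder, and `resolve_endpoint` is an illustrative helper, not part of any SDK):

```python
import os

# Values an OpenAI-compatible framework would read at startup.
os.environ["OPENAI_API_KEY"] = "dai-..."  # placeholder ACAI key
os.environ["OPENAI_BASE_URL"] = "https://api.agilecloud.ai/v1"

def resolve_endpoint() -> tuple[str, str]:
    """Mimic how OpenAI-SDK-based tools pick up base URL and key from the env."""
    return (
        os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1"),
        os.environ.get("OPENAI_API_KEY", ""),
    )

base_url, api_key = resolve_endpoint()
print(base_url)  # https://api.agilecloud.ai/v1
```

If the defaults print instead, the variables are not being exported into the process that runs your framework.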

What You Get After Migration

PII Redaction

PHI and PII automatically detected and masked before reaching the model

Audit Logs

Every request logged with correlation IDs, token counts, and policy decisions

Compliance Reports

Generate HIPAA, SOC 2, and PCI DSS evidence exports on demand

Guardrails

Content safety, prompt injection detection, and custom rules enforced per-key

Cost Controls

Per-key spend limits and rate limiting prevent runaway bills

Model Routing

Switch models by changing a string — no infrastructure changes
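The model-routing point above can be illustrated with a minimal sketch: from the client's side, the only thing that changes between providers is the model string. The mapping below is illustrative only — ACAI resolves routing server-side, and the Meta model id shown is hypothetical:

```python
# Illustrative model-name → upstream-provider table; ACAI maintains the real
# registry server-side, so client code never sees it.
MODEL_PROVIDERS = {
    "gpt-4o-mini": "openai",
    "gpt-4o": "openai",
    "claude-sonnet-4": "anthropic",
    "llama-3.1-70b": "meta",  # hypothetical model id for illustration
}

def provider_for(model: str) -> str:
    """Look up which upstream provider would serve a given model name."""
    try:
        return MODEL_PROVIDERS[model]
    except KeyError:
        raise ValueError(f"unknown model: {model}") from None

print(provider_for("claude-sonnet-4"))  # anthropic
```

Swapping `"gpt-4o-mini"` for `"claude-sonnet-4"` in a `chat.completions.create` call is the entire migration between providers — no client or infrastructure change.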

Ready to switch?

Create your API key and start sending requests.

Get API Key · Quick Start