Switch from OpenAI, Azure OpenAI, or Anthropic in 2 lines. ACAI is fully OpenAI-compatible — change the base URL and API key, keep everything else.
Change base_url and api_key. Nothing else changes.
Before (OpenAI direct)

```python
from openai import OpenAI

client = OpenAI(api_key="sk-...")
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize this claim"}],
)
```

After (ACAI)
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.agilecloud.ai/v1",  # ← changed
    api_key="dai-...",                        # ← changed
)
response = client.chat.completions.create(
    model="gpt-4o-mini",  # same model name
    messages=[{"role": "user", "content": "Summarize this claim"}],
)
```

Drop the AzureOpenAI client and deployment-name routing. Use standard model names.
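If existing code references Azure deployment names in many places, a one-time lookup table can translate them to standard model names during the cutover. A minimal sketch; the deployment names below are hypothetical:

```python
# Hypothetical map from Azure deployment names to standard model names.
DEPLOYMENT_TO_MODEL = {
    "my-gpt4-deployment": "gpt-4o",
    "my-gpt35-deployment": "gpt-3.5-turbo",
}

def to_standard_model(name: str) -> str:
    """Translate a deployment name; pass through anything already standard."""
    return DEPLOYMENT_TO_MODEL.get(name, name)
```

Once every call site uses standard names, delete the table.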
Before (Azure OpenAI)

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",
    api_key="...",
    api_version="2024-02-01",
)
response = client.chat.completions.create(
    model="my-gpt4-deployment",  # deployment name, not model
    messages=[{"role": "user", "content": "Hello"}],
)
```

After (ACAI)
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.agilecloud.ai/v1",
    api_key="dai-...",
)
response = client.chat.completions.create(
    model="gpt-4o",  # standard model name
    messages=[{"role": "user", "content": "Hello"}],
)
```

Switch to the OpenAI SDK format. ACAI serves Claude models through the same OpenAI-compatible API.
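For call sites being migrated off the Anthropic SDK, a small helper can reshape simple Messages-API params into OpenAI chat params. This is a sketch covering only the common fields (top-level system prompt, max_tokens), not the full API surface:

```python
def anthropic_to_openai_params(params: dict) -> dict:
    """Reshape simple Anthropic Messages params into OpenAI chat params.

    Anthropic's top-level 'system' string becomes a leading system
    message, and 'max_tokens' carries over unchanged.
    """
    out = {"model": params["model"], "messages": list(params["messages"])}
    if "system" in params:
        out["messages"].insert(0, {"role": "system", "content": params["system"]})
    if "max_tokens" in params:
        out["max_tokens"] = params["max_tokens"]
    return out
```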
Before (Anthropic SDK)

```python
import anthropic

client = anthropic.Anthropic(api_key="sk-ant-...")
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)
```

After (ACAI)
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.agilecloud.ai/v1",
    api_key="dai-...",
)
response = client.chat.completions.create(
    model="claude-sonnet-4",  # Anthropic via ACAI
    messages=[{"role": "user", "content": "Hello"}],
)
```

The same two-line change works with the OpenAI Node SDK:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.agilecloud.ai/v1",
  apiKey: "dai-...",
});

const response = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello" }],
});
```

Already have provider API keys? Register them in the Backends dashboard and ACAI proxies through them with full compliance applied. You pay your provider directly; ACAI charges only the compliance fee.
```bash
# .env
OPENAI_API_KEY=dai-...
OPENAI_BASE_URL=https://api.agilecloud.ai/v1
```
Many frameworks (LangChain, LlamaIndex, Vercel AI SDK) respect these env vars automatically — no code changes required.
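The convention these SDKs and frameworks follow is simple: explicit constructor arguments win, then `OPENAI_*` environment variables, then defaults. A stdlib-only sketch of that resolution order (the function name is illustrative, not a library API):

```python
import os

def resolve_config(base_url=None, api_key=None):
    """Mimic the fallback order OpenAI-compatible SDKs use:
    explicit argument -> OPENAI_* env var -> default."""
    return {
        "base_url": base_url
        or os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1"),
        "api_key": api_key or os.environ.get("OPENAI_API_KEY"),
    }
```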
- **PII Redaction**: PHI and PII are automatically detected and masked before reaching the model
- **Audit Logs**: Every request is logged with correlation IDs, token counts, and policy decisions
- **Compliance Reports**: Generate HIPAA, SOC 2, and PCI DSS evidence exports on demand
- **Guardrails**: Content safety, prompt injection detection, and custom rules enforced per key
- **Cost Controls**: Per-key spend limits and rate limiting prevent runaway bills
- **Model Routing**: Switch models by changing a string, with no infrastructure changes
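Because routing is just a string, a small tier map can centralize the choice so call sites never hard-code model names. A sketch; the tier names are illustrative:

```python
# Illustrative tier -> model map; swap models without touching call sites.
MODEL_TIERS = {
    "fast": "gpt-4o-mini",
    "general": "gpt-4o",
    "claude": "claude-sonnet-4",
}

def model_for(tier: str) -> str:
    """Resolve a tier name, falling back to the general-purpose model."""
    return MODEL_TIERS.get(tier, MODEL_TIERS["general"])
```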
Ready to switch?
Create your API key and start sending requests.
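Because the API is OpenAI-compatible, even a stdlib-only client works. A sketch that builds (but does not send) a chat completion request; the key and prompt are placeholders:

```python
import json
import urllib.request

def build_chat_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Assemble an OpenAI-compatible chat completion request for ACAI."""
    return urllib.request.Request(
        url="https://api.agilecloud.ai/v1/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "dai-...", "gpt-4o-mini",
    [{"role": "user", "content": "Hello"}],
)
# urllib.request.urlopen(req) would send it; omitted here.
```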