The gateway speaks three protocols simultaneously: OpenAI Chat Completions / Responses, Anthropic Messages, and OpenAI Images. Pick whichever protocol your SDK already uses and point its base URL at the gateway: https://api.aiduct.ai/v1 for OpenAI-style clients, https://api.aiduct.ai for Anthropic-style clients.
OpenAI SDK — Chat Completions
Drop-in replacement. Change only baseURL.
from openai import OpenAI
client = OpenAI(
api_key="$AIDUCT_KEY",
base_url="https://api.aiduct.ai/v1",
)
resp = client.chat.completions.create(
model="anthropic/claude-sonnet-4-5",
messages=[{"role": "user", "content": "Hi"}],
)
print(resp.choices[0].message.content)
Claude Code (Anthropic SDK)
Aiduct natively speaks the Anthropic Messages protocol at /v1/messages. Export the SDK env vars and your existing Claude Code workflow is metered through us.
# Claude Code (Anthropic SDK)
export ANTHROPIC_API_KEY="$AIDUCT_KEY"
export ANTHROPIC_BASE_URL="https://api.aiduct.ai"
# Then just run claude-code as usual — every request is routed
# through aiduct and metered against your aiduct.ai balance.
claude-code
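Before wiring up Claude Code, the same /v1/messages endpoint can be exercised directly with cURL. A sketch, with one caveat: the x-api-key and anthropic-version headers follow Anthropic's own API convention, and whether Aiduct requires anthropic-version (or also accepts an Authorization: Bearer header) is an assumption here, not confirmed by these docs.

```shell
curl https://api.aiduct.ai/v1/messages \
  -H "x-api-key: $AIDUCT_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4-5",
    "max_tokens": 1024,
    "messages": [{"role":"user","content":"Hi"}]
  }'
```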
Anthropic SDK — Messages
Same protocol the Claude Code CLI uses, available to any client.
import anthropic
client = anthropic.Anthropic(
api_key="$AIDUCT_KEY",
base_url="https://api.aiduct.ai",
)
msg = client.messages.create(
model="anthropic/claude-sonnet-4-5",
max_tokens=1024,
messages=[{"role": "user", "content": "Hi"}],
)
print(msg.content[0].text)
Codex CLI (OpenAI Responses)
Codex uses the newer /v1/responses endpoint. Aiduct proxies it transparently.
# OpenAI Codex CLI
export OPENAI_API_KEY="$AIDUCT_KEY"
export OPENAI_BASE_URL="https://api.aiduct.ai/v1"
codex "refactor the auth flow to use better-auth"
cURL — Chat (streaming)
curl https://api.aiduct.ai/v1/chat/completions \
-H "Authorization: Bearer $AIDUCT_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "anthropic/claude-sonnet-4-5",
"messages": [{"role":"user","content":"Hi"}],
"stream": true
}'
Node — OpenAI SDK
import OpenAI from "openai";
const client = new OpenAI({
apiKey: process.env.AIDUCT_KEY,
baseURL: "https://api.aiduct.ai/v1",
});
const result = await client.chat.completions.create({
model: "openai/gpt-4o-mini",
messages: [{ role: "user", content: "Hi" }],
});
console.log(result.choices[0].message.content);
cURL — Responses API
The protocol the Codex CLI uses under the hood.
curl https://api.aiduct.ai/v1/responses \
-H "Authorization: Bearer $AIDUCT_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "openai/gpt-4o",
"input": "Write a haiku about API gateways."
}'
cURL — Image generation
curl https://api.aiduct.ai/v1/images/generations \
-H "Authorization: Bearer $AIDUCT_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "fal/flux-schnell",
"prompt": "a red panda",
"size": "1024x1024"
}'
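With "stream": true, the chat completions endpoint returns Server-Sent Events: one data: line per delta chunk, terminated by a data: [DONE] sentinel. A minimal Python sketch for consuming that stream; the chunk field names follow the standard OpenAI-compatible schema, and the sample lines below are canned illustrations, not real gateway output.

```python
import json

def iter_stream_text(lines):
    """Yield content deltas from an OpenAI-style SSE stream.

    `lines` is an iterable of decoded text lines, e.g. from
    requests.get(..., stream=True).iter_lines(decode_unicode=True).
    """
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines and SSE comments
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        # The first chunk typically carries only a role, no content.
        if delta.get("content") is not None:
            yield delta["content"]

# Canned example chunks:
sample = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print("".join(iter_stream_text(sample)))  # Hello
```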