Because the endpoint is OpenAI-compatible, every SDK, framework, and coding agent in the ecosystem routes through us without a rewrite. Copy a snippet, change one string, ship.
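At the wire level, "change one string" means swapping the host. A minimal sketch, assuming the gateway mirrors the standard OpenAI `/v1/chat/completions` path (the model id is one of the gateway's namespaced ids shown below):

```shell
# Same request body you'd send to OpenAI; only the host and key change.
curl https://api.aigateway.sh/v1/chat/completions \
  -H "Authorization: Bearer sk-aig-..." \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-opus-4.7",
    "messages": [{"role": "user", "content": "hi"}]
  }'
```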
We also ship our own SDKs, CLI, and MCP server. Install one to reach the aggregator-native surface the OpenAI API doesn't model.
pip install aigateway-py
from aigateway import AIgateway
ai = AIgateway(api_key="sk-aig-...")
job = ai.jobs.create_video(
model="runwayml/gen-4",
prompt="a sunset over mountains",
)
done = ai.jobs.wait(job.id)
print(done.result_url)

pnpm add aigateway-js
import { AIgateway } from "aigateway-js";
const ai = new AIgateway({ apiKey: process.env.AIGATEWAY_API_KEY! });
const acct = await ai.subAccounts.create({
name: "acme-corp",
spendCapCents: 50_000,
});
console.log(acct.key);

npm i -g aigateway-cli   # or: npx aigateway-cli init

aig init                 # walks through key + scaffolds starter
aig call moonshot/kimi-k2.6 "explain edge inference"
aig sub-account create acme --cap 50000 --rpm 300
aig mcp tools            # inspect MCP from the shell
// MCP host config — Streamable HTTP (preferred)
{
"mcpServers": {
"aigateway": {
"type": "http",
"url": "https://api.aigateway.sh/mcp",
"headers": { "Authorization": "Bearer sk-aig-..." }
}
}
}
// Inspect tools at: https://api.aigateway.sh/mcp/inspect

Use the OpenAI / Anthropic / ai-sdk clients you already have. Set base URL, ship.
from openai import OpenAI
client = OpenAI(
base_url="https://api.aigateway.sh/v1",
api_key="sk-aig-...",
)
r = client.chat.completions.create(
model="anthropic/claude-opus-4.7",
messages=[{"role":"user","content":"hi"}],
)

import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";
const aig = createOpenAI({
baseURL: "https://api.aigateway.sh/v1",
apiKey: process.env.AIGATEWAY_KEY,
});
const { text } = await generateText({
model: aig("anthropic/claude-opus-4.7"),
prompt: "Hello",
});

import anthropic
client = anthropic.Anthropic(
base_url="https://api.aigateway.sh/anthropic",
api_key="sk-aig-...",
)
msg = client.messages.create(
model="claude-opus-4.7",
max_tokens=1024,
messages=[{"role":"user","content":"hi"}],
)

Every orchestration framework that speaks OpenAI speaks us.
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(
base_url="https://api.aigateway.sh/v1",
api_key="sk-aig-...",
model="anthropic/claude-opus-4.7",
)
print(llm.invoke("hi").content)

from llama_index.llms.openai import OpenAI
from llama_index.core import Settings
Settings.llm = OpenAI(
api_base="https://api.aigateway.sh/v1",
api_key="sk-aig-...",
model="openai/gpt-5.4",
)

import { Agent } from "@mastra/core/agent";
import { createOpenAI } from "@ai-sdk/openai";

// baseURL is set on the provider, not per-model
const aig = createOpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_API_KEY,
});

const agent = new Agent({
  name: "researcher",
  model: aig("anthropic/claude-opus-4.7"),
});

n8n → Credentials → OpenAI API
Base URL: https://api.aigateway.sh/v1
API key: sk-aig-...
Point your IDE or terminal agent at one URL to unlock every model.
# ~/.claude/settings.json (or env)
export ANTHROPIC_BASE_URL="https://api.aigateway.sh/anthropic"
export ANTHROPIC_API_KEY="sk-aig-..."
# now use any Anthropic model via the gateway
# Cursor → Settings → Models → Custom
Base URL: https://api.aigateway.sh/v1
API key: sk-aig-...
Models: anthropic/claude-opus-4.7,
openai/gpt-5.4,
moonshot/kimi-k2.6

{
"models": [{
"title": "AIgateway",
"provider": "openai",
"apiBase": "https://api.aigateway.sh/v1",
"apiKey": "sk-aig-...",
"model": "anthropic/claude-opus-4.7"
}]
}

Provider: OpenAI Compatible
Base URL: https://api.aigateway.sh/v1
API key: sk-aig-...
Model ID: anthropic/claude-opus-4.7
openclaw config add aigateway \
  --llms-txt https://aigateway.sh/llms.txt \
  --api-key sk-aig-...
Since we're OpenAI-compatible, almost every tool already works; we just haven't written down the exact one-line change for each yet. Email us the tool name and we usually ship a doc within 24 hours.