Integrations

Works with what you're already using.
Setup is a base URL.

An OpenAI-compatible endpoint means every SDK, framework, and coding agent in the ecosystem routes through us without a rewrite. Copy the snippet, change one string, ship.

Read the docs · For coding agents
Official packages

Our own SDKs, CLI, and MCP server. Install one and reach for the aggregator-native surface OpenAI doesn't model.

aigateway-py · Python SDK
PYTHON
Async jobs, sub-accounts, evals, replays, signed URLs, webhook verification.
pip install aigateway-py

from aigateway import AIgateway
ai = AIgateway(api_key="sk-aig-...")

job = ai.jobs.create_video(
    model="runwayml/gen-4",
    prompt="a sunset over mountains",
)
done = ai.jobs.wait(job.id)
print(done.result_url)
Full setup →
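The SDK ships a webhook verification helper; if you'd rather verify deliveries by hand, here is a minimal sketch, assuming a hypothetical HMAC-SHA256 scheme in which the signature arrives as a hex digest in a request header (the header name and secret format below are illustrative, not the documented contract):

```python
import hmac
import hashlib

def verify_webhook(secret: str, payload: bytes, signature: str) -> bool:
    # Recompute the HMAC-SHA256 of the raw request body and compare it
    # to the received signature in constant time to avoid timing attacks.
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# A delivery signed with the shared secret verifies; a tampered body does not.
secret = "whsec_example"
body = b'{"event": "job.completed", "id": "job_123"}'
sig = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
print(verify_webhook(secret, body, sig))         # True
print(verify_webhook(secret, b"tampered", sig))  # False
```

Always verify against the raw request bytes, not a re-serialized JSON body, since re-serialization can reorder keys and change whitespace.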
aigateway-js · Node SDK
TS
Same surface in TypeScript. ESM + CJS, zero runtime deps, Node 18+.
pnpm add aigateway-js

import { AIgateway } from "aigateway-js";
const ai = new AIgateway({ apiKey: process.env.AIGATEWAY_API_KEY! });

const acct = await ai.subAccounts.create({
  name: "acme-corp",
  spendCapCents: 50_000,
});
console.log(acct.key);
Full setup →
aigateway-cli · `aig`
BASH
Terminal-native — `aig init`, `aig call`, `aig models`, `aig mcp call`.
npm i -g aigateway-cli   # or: npx aigateway-cli init

aig init                                          # walks through key + scaffolds starter
aig call moonshot/kimi-k2.6 "explain edge inference"
aig sub-account create acme --cap 50000 --rpm 300
aig mcp tools                                     # inspect MCP from the shell
Full setup →
MCP (Model Context Protocol)
JSON
Add AIgateway as an MCP server in Claude Code, Cursor agents, any MCP host.
// MCP host config — Streamable HTTP (preferred)
{
  "mcpServers": {
    "aigateway": {
      "type": "http",
      "url": "https://api.aigateway.sh/mcp",
      "headers": { "Authorization": "Bearer sk-aig-..." }
    }
  }
}
// Inspect tools at: https://api.aigateway.sh/mcp/inspect
Full setup →
Drop-in SDKs

Use the OpenAI / Anthropic / ai-sdk clients you already have. Set base URL, ship.

OpenAI SDK (any language)
PYTHON
Change the base URL. That's the whole integration.
from openai import OpenAI
client = OpenAI(
    base_url="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
)
r = client.chat.completions.create(
    model="anthropic/claude-opus-4.7",
    messages=[{"role":"user","content":"hi"}],
)
Full setup →
Vercel AI SDK
TS
Route any provider through ai-sdk with one import.
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const aig = createOpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_KEY,
});

const { text } = await generateText({
  model: aig("anthropic/claude-opus-4.7"),
  prompt: "Hello",
});
Full setup →
Anthropic SDK
PYTHON
Official Anthropic client. Swap base URL, keep everything else.
import anthropic
client = anthropic.Anthropic(
    base_url="https://api.aigateway.sh/anthropic",
    api_key="sk-aig-...",
)
msg = client.messages.create(
    model="claude-opus-4.7",
    max_tokens=1024,
    messages=[{"role":"user","content":"hi"}],
)
Full setup →
Frameworks

Every orchestration framework that speaks OpenAI speaks us.

LangChain
PYTHON
Every ChatOpenAI / ChatAnthropic / embeddings binding works unchanged.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
    model="anthropic/claude-opus-4.7",
)
print(llm.invoke("hi").content)
Full setup →
LlamaIndex
PYTHON
OpenAI-compatible means every Settings.llm binding just works.
from llama_index.llms.openai import OpenAI
from llama_index.core import Settings

Settings.llm = OpenAI(
    api_base="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
    model="openai/gpt-5.4",
)
Full setup →
Mastra
TS
Agent framework — point models at the gateway, tools route automatically.
import { Agent } from "@mastra/core/agent";
import { createOpenAI } from "@ai-sdk/openai";

// baseURL is a provider setting, so create a provider pointed at the gateway
const aig = createOpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_API_KEY,
});

const agent = new Agent({
  name: "researcher",
  model: aig("anthropic/claude-opus-4.7"),
});
Full setup →
n8n
INI
OpenAI credential → any model across every node.
n8n → Credentials → OpenAI API
Base URL:  https://api.aigateway.sh/v1
API key:   sk-aig-...
Full setup →
Coding agents

Point your IDE or terminal agent at one URL to unlock every model.

Claude Code
BASH
Set ANTHROPIC_BASE_URL + ANTHROPIC_API_KEY. Done.
# shell env (or the "env" block in ~/.claude/settings.json)
export ANTHROPIC_BASE_URL="https://api.aigateway.sh/anthropic"
export ANTHROPIC_API_KEY="sk-aig-..."
# now use any Anthropic model via the gateway
Full setup →
Cursor
INI
Settings → Models → Custom OpenAI endpoint.
# Cursor → Settings → Models → Custom
Base URL:  https://api.aigateway.sh/v1
API key:   sk-aig-...
Models:    anthropic/claude-opus-4.7,
           openai/gpt-5.4,
           moonshot/kimi-k2.6
Full setup →
Continue
JSON
config.json — one provider block routes every model.
{
  "models": [{
    "title": "AIgateway",
    "provider": "openai",
    "apiBase": "https://api.aigateway.sh/v1",
    "apiKey": "sk-aig-...",
    "model": "anthropic/claude-opus-4.7"
  }]
}
Full setup →
Cline
INI
OpenAI Compatible provider → point at the gateway.
Provider:  OpenAI Compatible
Base URL:  https://api.aigateway.sh/v1
API key:   sk-aig-...
Model ID:  anthropic/claude-opus-4.7
Full setup →
OpenClaw
BASH
Autoconfigure from llms.txt — one command.
openclaw config add aigateway \
  --llms-txt https://aigateway.sh/llms.txt \
  --api-key sk-aig-...
Full setup →
Missing one?

Tell us and we'll write the guide.

Since we're OpenAI-compatible, almost every tool already works — we just haven't written down the exact one-line change yet. Email us the tool name and we'll usually ship a doc within 24 hours.

Request an integration · Agent autoconfigure