Call Claude Opus 4.7, Sonnet 4.6, and Haiku 4.5 from the OpenAI Python or JS SDK without installing the Anthropic client. Tool calling, vision, and prompt caching all pass through.
Anthropic's API has a slightly different shape than OpenAI's — different system prompt position, different tool schema, different streaming events. Most teams that started on OpenAI either install both SDKs and write an adapter, or block on a rewrite.
You don't have to. Point the OpenAI SDK at AIgateway and Claude is a model slug change.
from openai import OpenAI
client = OpenAI(
base_url="https://api.aigateway.sh/v1",
api_key="sk-aig-...",
)
resp = client.chat.completions.create(
model="anthropic/claude-opus-4.7",
messages=[
{"role": "system", "content": "You are a careful editor."},
{"role": "user", "content": "Tighten this paragraph: ..."},
],
)

All three pass through. The OpenAI tool-call shape (name + JSON-string arguments) is translated to Anthropic's tool_use blocks server-side; the response comes back in OpenAI shape, so your existing parser keeps working.
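The mapping is easy to picture. Here is a minimal sketch of that server-side translation (illustrative only; the gateway does this for you, and the helper name is ours):

```python
import json

def openai_tool_call_to_anthropic(tool_call: dict) -> dict:
    """Map one OpenAI-style tool call to an Anthropic tool_use block.
    Illustrative sketch -- AIgateway performs this translation server-side."""
    return {
        "type": "tool_use",
        "id": tool_call["id"],
        "name": tool_call["function"]["name"],
        # OpenAI serializes arguments as a JSON string; Anthropic wants an object.
        "input": json.loads(tool_call["function"]["arguments"]),
    }

call = {
    "id": "call_abc123",
    "type": "function",
    "function": {"name": "get_weather", "arguments": '{"city": "Oslo"}'},
}
block = openai_tool_call_to_anthropic(call)
# block is {"type": "tool_use", "id": "call_abc123",
#           "name": "get_weather", "input": {"city": "Oslo"}}
```

The reverse direction (tool_use back to an OpenAI tool_calls entry) is the same mapping run backwards, which is why your existing parsing code never sees the Anthropic shape.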
Vision: send image_url parts in the messages array exactly like you would for GPT-5.4. We translate to Anthropic's image content blocks.
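For a concrete picture of that translation, here is a sketch (our own helper, not gateway code) mapping one OpenAI image_url content part to an Anthropic image block, covering both data: URLs and plain https URLs:

```python
import base64

def openai_image_part_to_anthropic(part: dict) -> dict:
    """Map an OpenAI image_url content part to an Anthropic image block.
    Illustrative sketch -- the gateway does this translation server-side."""
    url = part["image_url"]["url"]
    if url.startswith("data:"):
        # e.g. "data:image/png;base64,<payload>"
        header, b64 = url.split(",", 1)
        media_type = header[len("data:"):].split(";", 1)[0]
        return {
            "type": "image",
            "source": {"type": "base64", "media_type": media_type, "data": b64},
        }
    # Plain URLs map to Anthropic's url source type.
    return {"type": "image", "source": {"type": "url", "url": url}}

part = {
    "type": "image_url",
    "image_url": {"url": "data:image/png;base64," + base64.b64encode(b"\x89PNG").decode()},
}
block = openai_image_part_to_anthropic(part)
# block["source"]["media_type"] == "image/png"
```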
JSON mode: response_format={"type": "json_object"} is honored.
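A minimal request sketch (prompt text is ours, purely illustrative): with json_object honored, the reply body is guaranteed to parse, so json.loads on the message content is safe without a retry loop.

```python
import json

# Keyword arguments you would pass to client.chat.completions.create(...).
kwargs = {
    "model": "anthropic/claude-opus-4.7",
    "response_format": {"type": "json_object"},
    "messages": [
        {"role": "system", "content": 'Reply with a JSON object: {"summary": "..."}'},
        {"role": "user", "content": "Summarize the release notes."},
    ],
}

# With json_object honored, the reply content is guaranteed parseable:
reply = '{"summary": "A one-line summary."}'  # e.g. resp.choices[0].message.content
data = json.loads(reply)
```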
Anthropic supports prompt caching for prompt prefixes above a minimum length, opted in via cache_control hints. AIgateway proxies those hints through and bills cache reads at 10% of the uncached input cost. If your prompts repeat the same long context, cache hit rates over 30% are realistic.
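The 10% read rate makes the savings easy to estimate. A small sketch of the blended input cost (our own helper, using only the numbers above):

```python
def effective_input_cost(base_rate: float, hit_rate: float,
                         read_discount: float = 0.10) -> float:
    """Blended per-token input cost when a fraction of input tokens hit the
    cache. Cache reads bill at `read_discount` (10%) of the uncached rate."""
    return base_rate * ((1 - hit_rate) + hit_rate * read_discount)

# At a 30% hit rate, input spend drops 27% relative to no caching:
blended = effective_input_cost(1.0, 0.30)  # -> 0.73
```

In other words, every point of cache hit rate shaves 0.9% off your input bill, which is why long repeated system prompts are the first thing worth caching.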
If you've already written code against the Anthropic SDK, point it at our Anthropic-shape endpoint instead. Same models, same shape, no other changes.
import anthropic
client = anthropic.Anthropic(
base_url="https://api.aigateway.sh/anthropic",
api_key="sk-aig-...",
)
client.messages.create(
model="claude-opus-4.7",
max_tokens=1024,
messages=[{"role": "user", "content": "hi"}],
)