Use Moonshot's Kimi K2.6 (open frontier agent model, 256K context, vision, native tool calling) from the OpenAI SDK by changing only the base_url and model slug. Free at 100 req/day through Apr 30, 2026.
Kimi K2.6 is Moonshot's frontier open agent model: 256K context, native tool calling, vision input. It delivers Anthropic Sonnet-class output quality at a fraction of the cost.
AIgateway hosts it behind an OpenAI-compatible endpoint. So instead of installing a Moonshot SDK, integrating their auth flow, or porting your tool-call schema, you change one line in your existing OpenAI SDK code.
The only difference from a normal OpenAI client is the base_url and the model slug. Streaming, tool calling, vision, JSON mode all work the same.
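If you pass tools, they use the standard OpenAI function-calling schema, so whatever you already send to OpenAI works unchanged. A minimal sketch of what a `my_tools` list might look like (the `search_web` tool here is a hypothetical placeholder, not part of the gateway):

```python
# Hypothetical tool definition for illustration; the shape is the
# standard OpenAI function-calling schema, passed through as-is.
my_tools = [
    {
        "type": "function",
        "function": {
            "name": "search_web",
            "description": "Search the web and return the top results.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {"type": "string", "description": "Search query"},
                },
                "required": ["query"],
            },
        },
    }
]
```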
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
)

stream = client.chat.completions.create(
    model="moonshot/kimi-k2.6",
    messages=[
        {"role": "system", "content": "You are a research planner."},
        {"role": "user", "content": "Plan a 4-step research agent."},
    ],
    tools=my_tools,  # your OpenAI-style tool schemas
    stream=True,
)

for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")
```

Same swap on the JS side: change the baseURL, change the model slug.
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_API_KEY,
});

const stream = await client.chat.completions.create({
  model: "moonshot/kimi-k2.6",
  messages: [{ role: "user", content: "hello" }],
  stream: true,
});

// Consume the stream as it arrives.
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

Through Apr 30, 2026, every AIgateway account gets 100 Kimi K2.6 requests/day with no card on file. After that, the same model bills pass-through with the rest of the catalog.
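At 100 requests/day, batch jobs will eventually hit the cap mid-run. Assuming the gateway signals this with a standard HTTP 429 the way most OpenAI-compatible endpoints do (an assumption, not documented behavior), a small exponential-backoff wrapper keeps scripts alive:

```python
import time

def with_backoff(call, retries=4, base_delay=1.0, is_rate_limit=None):
    """Retry `call` on rate-limit errors with exponential backoff.

    `is_rate_limit` decides whether an exception is retryable; with the
    OpenAI SDK you would pass something like
    lambda e: isinstance(e, openai.RateLimitError).
    """
    is_rate_limit = is_rate_limit or (lambda e: False)
    for attempt in range(retries):
        try:
            return call()
        except Exception as e:
            if not is_rate_limit(e) or attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```

Usage with the snippet above: `with_backoff(lambda: client.chat.completions.create(...), is_rate_limit=lambda e: isinstance(e, openai.RateLimitError))`.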
Get your key at aigateway.sh/signin and drop it into the snippet above. To see how Kimi K2.6 stacks up against Opus 4.7 or GPT-5.4 on your own data, run an eval: point your dataset at all three and pick the winner.
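A bare-bones version of that bake-off: run the same prompts through each model slug and score the outputs with whatever judge you trust. The Opus and GPT slugs below are hypothetical placeholders, as is the `ask`/`judge` interface; check the gateway catalog for the real model names.

```python
def run_bakeoff(ask, judge, prompts, models):
    """Score each model on each prompt and return {model: mean_score}.

    `ask(model, prompt)` returns a completion string (e.g. a thin wrapper
    around client.chat.completions.create); `judge(prompt, answer)` returns
    a numeric score for that answer.
    """
    scores = {}
    for model in models:
        results = [judge(p, ask(model, p)) for p in prompts]
        scores[model] = sum(results) / len(results)
    return scores

# Hypothetical slugs; confirm against the gateway's model catalog.
CANDIDATES = ["moonshot/kimi-k2.6", "anthropic/claude-opus-4.7", "openai/gpt-5.4"]
```

Swap in a real `ask` that calls the gateway and a `judge` that fits your task (exact match, rubric scoring, or an LLM judge), then pick the top-scoring model.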