To use Gemini 3 Flash, install the OpenAI SDK (any language), point base_url at https://api.aigateway.sh/v1 with an sk-aig-... key, and set model="google/gemini-3-flash". That's it — no provider account, no separate billing. Google's model is served through AIgateway's OpenAI-compatible endpoint so every existing OpenAI integration (Cursor, LangChain, Vercel AI SDK) works unchanged.
Sign in at aigateway.sh with GitHub or Google, open the dashboard, and copy an sk-aig-... key. No credit card required to claim a key — the free tier is 100 requests/day on Kimi K2.6 so you can test end-to-end before you upgrade.
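Once you have a key, you can smoke-test it before wiring up an SDK. A minimal sketch using only Python's standard library, assuming the gateway exposes the OpenAI-compatible GET /v1/models endpoint (standard for OpenAI-shaped APIs, but an assumption here); the network call only runs when AIGATEWAY_KEY is set in your environment:

```python
# Quick key check against the (assumed) OpenAI-compatible /v1/models endpoint.
import json
import os
import urllib.request

BASE_URL = "https://api.aigateway.sh/v1"

def auth_headers(key: str) -> dict:
    # OpenAI-compatible APIs expect a Bearer token in the Authorization header.
    return {"Authorization": f"Bearer {key}"}

def list_models(key: str) -> list:
    req = urllib.request.Request(f"{BASE_URL}/models", headers=auth_headers(key))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]

if os.environ.get("AIGATEWAY_KEY"):  # only hit the network when a key is present
    for m in list_models(os.environ["AIGATEWAY_KEY"]):
        print(m["id"])
```

If the key is valid you should see the available model IDs printed, one per line.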
Every OpenAI client library ships with a base_url (Python) or baseURL (Node) field. Set it to https://api.aigateway.sh/v1 and drop in your sk-aig-... key. Nothing else in your application code needs to change — the request and response schemas are OpenAI-identical.
On your chat.completions.create call (or the images, embeddings, or audio endpoint, whichever matches your modality), pass model="google/gemini-3-flash". AIgateway handles provider auth, billing, retries, and request transformation so Google-specific quirks don't leak into your code.
# pip install aigateway-py openai
# aigateway-py: sub-accounts, evals, replays, jobs, webhook verify.
# openai SDK: chat/embeddings/images/audio — drop-in compat per our SDK's own guidance.
from openai import OpenAI
client = OpenAI(
    base_url="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
)
r = client.chat.completions.create(
    model="google/gemini-3-flash",
    messages=[{"role": "user", "content": "Explain vector databases in two sentences."}],
)
print(r.choices[0].message.content)

// npm i aigateway-js openai
// aigateway-js: sub-accounts, evals, replays, jobs, webhook verify.
// openai SDK: chat/embeddings/images/audio — drop-in compat.
import OpenAI from "openai";
const client = new OpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_KEY,
});
const r = await client.chat.completions.create({
  model: "google/gemini-3-flash",
  messages: [{ role: "user", content: "Explain vector databases in two sentences." }],
});
console.log(r.choices[0].message.content);

Do I need my own Google API key?
No. AIgateway provides a unified sk-aig-... key that routes to Google on your behalf. You only need an AIgateway account.
Does Gemini 3 Flash support tool use (function calling)?
Yes. Gemini 3 Flash supports OpenAI-standard function calling through the tools parameter on chat.completions.create. Parallel tool calls are supported.
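The tool-use flow above can be sketched end to end. Here get_weather and its schema are hypothetical stand-ins for your own tools, and the network call only runs when AIGATEWAY_KEY is set, so the dispatch logic can be exercised offline:

```python
# Sketch of OpenAI-standard tool use through AIgateway. get_weather and its
# JSON schema below are hypothetical stand-ins for your own tools.
import json
import os

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # placeholder implementation

def run_tool_call(name: str, arguments: str) -> str:
    # Dispatch one tool call from the model; arguments arrive as a JSON string.
    args = json.loads(arguments)
    if name == "get_weather":
        return get_weather(**args)
    raise ValueError(f"unknown tool: {name}")

if os.environ.get("AIGATEWAY_KEY"):  # network call only runs with a real key
    from openai import OpenAI
    client = OpenAI(base_url="https://api.aigateway.sh/v1",
                    api_key=os.environ["AIGATEWAY_KEY"])
    r = client.chat.completions.create(
        model="google/gemini-3-flash",
        messages=[{"role": "user", "content": "What's the weather in Paris?"}],
        tools=tools,
    )
    for call in r.choices[0].message.tool_calls or []:
        print(run_tool_call(call.function.name, call.function.arguments))
```

In a full agent loop you would append each tool result back to messages as a role "tool" message and call the API again; that second round trip is omitted here for brevity.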
What are the context and output limits?
Gemini 3 Flash supports a 1,000,000-token context window. Output is capped at 8,192 tokens per response.
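Those limits can be budgeted for on the client side before you send a request. A rough sketch using a characters-per-token heuristic (about 4 characters per token for English text; the true count depends on the tokenizer, so treat this as an estimate, not a guarantee):

```python
# Rough client-side budget check against Gemini 3 Flash's documented limits.
CONTEXT_TOKENS = 1_000_000   # total context window
MAX_OUTPUT_TOKENS = 8_192    # per-response output cap

def rough_tokens(text: str) -> int:
    # Crude heuristic: ~4 characters per token for English prose.
    return len(text) // 4 + 1

def fits_context(prompt: str, reserve_output: int = MAX_OUTPUT_TOKENS) -> bool:
    # Reserve room for the largest possible response inside the window.
    return rough_tokens(prompt) + reserve_output <= CONTEXT_TOKENS
```

If fits_context returns False, chunk the input or summarize before sending rather than letting the provider truncate for you.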
How much does it cost?
AIgateway's free tier gives 100 requests/day on Kimi K2.6. Other models, including Gemini 3 Flash, require a $5 top-up, after which usage is billed at pass-through provider pricing.
Does it work with my existing OpenAI-based tools?
Yes. Anything that speaks OpenAI's API shape works: set base_url to https://api.aigateway.sh/v1, paste your AIgateway key, and set the model string. Cursor, Cline, Continue, LangChain, LlamaIndex, and the Vercel AI SDK are all supported.
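For tools that don't expose a base-URL setting directly, the official OpenAI SDKs also read OPENAI_BASE_URL and OPENAI_API_KEY from the environment, so you can often repoint a whole process without touching its code. Whether a particular tool built on those SDKs honors these variables is tool-specific, so check its docs; a sketch:

```python
# Repoint any OpenAI-SDK-based process at AIgateway via environment variables.
# The official OpenAI Python and Node SDKs read OPENAI_API_KEY and
# OPENAI_BASE_URL; whether a specific tool built on them does is tool-dependent.
import os

os.environ["OPENAI_BASE_URL"] = "https://api.aigateway.sh/v1"
os.environ["OPENAI_API_KEY"] = os.environ.get("AIGATEWAY_KEY", "sk-aig-...")

# From here, an OpenAI client constructed with no arguments (in this process,
# or in a subprocess inheriting this environment) targets AIgateway instead
# of api.openai.com.
print(os.environ["OPENAI_BASE_URL"])
```

This is handy for CI jobs and third-party scripts where editing the client construction isn't practical.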