
How do I use Gemini 3 Flash?

To use Gemini 3 Flash, install the OpenAI SDK (any language), point base_url at https://api.aigateway.sh/v1 with an sk-aig-... key, and set model="google/gemini-3-flash". That's it — no provider account, no separate billing. Google's model is served through AIgateway's OpenAI-compatible endpoint so every existing OpenAI integration (Cursor, LangChain, Vercel AI SDK) works unchanged.

How it works

1. Grab an AIgateway key

Sign in at aigateway.sh with GitHub or Google, open the dashboard, and copy an sk-aig-... key. No credit card required to claim a key — the free tier is 100 requests/day on Kimi K2.6 so you can test end-to-end before you upgrade.

2. Point the OpenAI SDK at AIgateway

Every OpenAI client library ships with a base_url (Python) or baseURL (Node) field. Set it to https://api.aigateway.sh/v1 and drop in your sk-aig-... key. Nothing else in your application code needs to change — the request and response schemas are OpenAI-identical.

3. Set model="google/gemini-3-flash"

On your chat.completions.create call, pass model="google/gemini-3-flash". AIgateway handles provider auth, billing, retries, and request/response translation, so Google-specific quirks don't leak into your code.

Code example

Python
# pip install aigateway-py openai
# aigateway-py: sub-accounts, evals, replays, jobs, webhook verify.
# openai SDK: chat/embeddings/images/audio (drop-in OpenAI compatibility).
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
)

r = client.chat.completions.create(
    model="google/gemini-3-flash",
    messages=[{"role": "user", "content": "Explain vector databases in two sentences."}],
)
print(r.choices[0].message.content)
Node / TypeScript
// npm i aigateway-js openai
// aigateway-js: sub-accounts, evals, replays, jobs, webhook verify.
// openai SDK: chat/embeddings/images/audio — drop-in compat.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_KEY,
});

const r = await client.chat.completions.create({
  model: "google/gemini-3-flash",
  messages: [{ role: "user", content: "Explain vector databases in two sentences." }],
});
console.log(r.choices[0].message.content);

FAQ

Do I need a Google account to use Gemini 3 Flash?

No. AIgateway provides a unified sk-aig-... key that routes to Google on your behalf. You only need an AIgateway account.

Does Gemini 3 Flash support tool calling?

Yes — Gemini 3 Flash supports OpenAI-standard function calling through the tools parameter on chat.completions.create. Parallel tool calls are supported.
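As a minimal sketch, a tool call looks the same as with any OpenAI model. The get_weather tool, its schema, and the AIGATEWAY_KEY environment-variable name are hypothetical, invented here for illustration:

```python
import json
import os

# Hypothetical tool schema for illustration; any OpenAI-style JSON-schema
# function definition works the same way.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

def ask_with_tools(api_key: str) -> None:
    # Imported lazily so the schema above can be reused without the SDK installed.
    from openai import OpenAI

    client = OpenAI(base_url="https://api.aigateway.sh/v1", api_key=api_key)
    r = client.chat.completions.create(
        model="google/gemini-3-flash",
        messages=[{"role": "user", "content": "What's the weather in Oslo?"}],
        tools=tools,
    )
    # If the model chose to call the tool, the name and JSON arguments are here.
    for call in r.choices[0].message.tool_calls or []:
        print(call.function.name, json.loads(call.function.arguments))

# Only hits the network when a key is configured.
if os.environ.get("AIGATEWAY_KEY"):
    ask_with_tools(os.environ["AIGATEWAY_KEY"])
```

When the model decides no tool is needed, tool_calls is empty and the reply arrives in message.content as usual, so handle both branches in production code.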

What is the context window of Gemini 3 Flash?

Gemini 3 Flash supports a 1,000,000-token context window. Output is capped at 8,192 tokens per response.
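Because the input budget dwarfs the output cap, it can help to sanity-check a large prompt before sending it. A rough sketch, where the 4-characters-per-token ratio is a coarse English-text heuristic rather than Google's actual tokenizer:

```python
CONTEXT_LIMIT = 1_000_000  # Gemini 3 Flash context window, in tokens
OUTPUT_CAP = 8_192         # maximum tokens per response
CHARS_PER_TOKEN = 4        # coarse heuristic, not a real tokenizer

def fits_in_context(text: str, reserved_output: int = OUTPUT_CAP) -> bool:
    """Estimate whether `text` plus a full-length reply fits the window."""
    estimated_tokens = len(text) // CHARS_PER_TOKEN + 1
    return estimated_tokens + reserved_output <= CONTEXT_LIMIT

print(fits_in_context("hello " * 100))  # short prompt: True
```

For exact counts, tokenize with the provider's tooling before trusting a borderline result; the heuristic only rules out prompts that are clearly too large.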

Is Gemini 3 Flash free to try?

AIgateway's free tier gives 100 requests/day on Kimi K2.6 only. Gemini 3 Flash is a paid model: a $5 top-up unlocks it, after which usage is billed at pass-through pricing.

Can I use Gemini 3 Flash with Cursor or LangChain?

Yes. Anything that speaks OpenAI's API shape works. Set base_url to https://api.aigateway.sh/v1, paste your AIgateway key, and set the model string. Cursor, Cline, Continue, LangChain, LlamaIndex, and the Vercel AI SDK are all supported.
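For example, a LangChain setup only needs the base URL and key swapped in. This sketch assumes the langchain-openai package is installed (pip install langchain-openai); the AIGATEWAY_KEY environment-variable name is just a convention:

```python
import os

# Shared routing config: the only AIgateway-specific pieces.
AIGATEWAY_CONFIG = {
    "model": "google/gemini-3-flash",
    "base_url": "https://api.aigateway.sh/v1",
}

def make_llm(api_key: str):
    # Imported lazily so this module loads even without LangChain installed.
    from langchain_openai import ChatOpenAI
    return ChatOpenAI(api_key=api_key, **AIGATEWAY_CONFIG)

# Only hits the network when a key is configured.
if os.environ.get("AIGATEWAY_KEY"):
    llm = make_llm(os.environ["AIGATEWAY_KEY"])
    print(llm.invoke("Say hi in one word.").content)
```

The same pattern applies to any OpenAI-compatible client: keep the model string and base URL together in one config object and pass the key from the environment.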

TRY IT NOW

One key, every model. Free tier, no card.

Get an AIgateway key
Open the playground