To use Aura-2-EN, install the OpenAI SDK (any language), point base_url at https://api.aigateway.sh/v1 with an sk-aig-... key, and set model="deepgram/aura-2-en". That's it — no provider account, no separate billing. Deepgram's model is served through AIgateway's OpenAI-compatible endpoint so every existing OpenAI integration (Cursor, LangChain, Vercel AI SDK) works unchanged.
Sign in at aigateway.sh with GitHub or Google, open the dashboard, and copy an sk-aig-... key. No credit card required to claim a key — the free tier is 100 requests/day on Kimi K2.6 so you can test end-to-end before you upgrade.
Every OpenAI client library ships with a base_url (Python) or baseURL (Node) field. Set it to https://api.aigateway.sh/v1 and drop in your sk-aig-... key. Nothing else in your application code needs to change — the request and response schemas are OpenAI-identical.
Pass model="deepgram/aura-2-en" on whichever endpoint matches the model's modality. Aura-2-EN is a text-to-speech model, so that endpoint is audio.speech.create rather than chat.completions.create, images, or embeddings. AIgateway handles provider auth, billing, retries, and request transformation so Deepgram-specific quirks don't leak into your code.
```python
# pip install openai
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
)

audio = client.audio.speech.create(
    model="deepgram/aura-2-en",
    voice="alloy",
    input="Hello from AIgateway.",
)
audio.stream_to_file("out.mp3")
```

```javascript
import OpenAI from "openai";
import fs from "node:fs";

const client = new OpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_KEY,
});

const audio = await client.audio.speech.create({
  model: "deepgram/aura-2-en",
  voice: "alloy",
  input: "Hello from AIgateway.",
});

const buf = Buffer.from(await audio.arrayBuffer());
fs.writeFileSync("out.mp3", buf);
```

Do I need a Deepgram API key? No: AIgateway provides a unified sk-aig-... key that routes to Deepgram on your behalf. You only need an AIgateway account.
Aura-2-EN does not support tool calling natively. Use it for speech generation; route prompts that need tool calling to a model like Claude Sonnet 4.6 or GPT-5.4.
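In practice this means a thin dispatch layer in front of the gateway that picks the model string per request. A minimal sketch, assuming a hand-maintained capability table; the tool-capable model IDs below mirror the gateway's provider/model naming convention but are assumptions, not a published list:

```python
# Hypothetical capability table. The IDs follow AIgateway's
# "provider/model" convention; verify exact strings on the models page.
TOOL_CAPABLE = {
    "anthropic/claude-sonnet-4.6",
    "openai/gpt-5.4",
}

def pick_model(wants_tools: bool, default: str = "deepgram/aura-2-en") -> str:
    """Route tool-calling prompts to a tool-capable model;
    everything else goes to the default generation model."""
    if wants_tools:
        # Deterministic choice: first tool-capable model alphabetically.
        return sorted(TOOL_CAPABLE)[0]
    return default

print(pick_model(False))  # deepgram/aura-2-en
print(pick_model(True))   # anthropic/claude-sonnet-4.6
```

Because every model goes through the same base_url and key, the returned string is the only thing that changes between requests.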
Context window details for Aura-2-EN are on the spec page at /models/deepgram/aura-2-en.
AIgateway's free tier gives 100 requests/day on Kimi K2.6. All other models, including Aura-2-EN, require a $5 top-up, then pass-through pricing.
Does it work with existing OpenAI-compatible tools? Yes: anything that speaks OpenAI's API shape works. Set base_url to https://api.aigateway.sh/v1, paste your AIgateway key, and set the model string. Cursor, Cline, Continue, LangChain, LlamaIndex, and the Vercel AI SDK are all supported.
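All of those integrations ultimately emit the same HTTP request, which is why only three settings matter. A stdlib-only sketch of what any OpenAI-compatible client sends for a speech call through the gateway; nothing is sent over the network here, and the /v1/audio/speech path and Bearer header follow OpenAI's published speech API (the helper function name is illustrative):

```python
import json
from urllib.request import Request

def build_speech_request(api_key: str) -> Request:
    """Build (but do not send) the POST that an OpenAI-compatible
    client issues for a text-to-speech call through AIgateway."""
    body = {
        "model": "deepgram/aura-2-en",
        "voice": "alloy",
        "input": "Hello from AIgateway.",
    }
    return Request(
        "https://api.aigateway.sh/v1/audio/speech",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_speech_request("sk-aig-...")
print(req.full_url)                    # https://api.aigateway.sh/v1/audio/speech
print(json.loads(req.data)["model"])   # deepgram/aura-2-en
```

Swapping the base URL into any tool's settings changes only the host portion of this request; the payload and auth scheme stay OpenAI-shaped.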