Here's a minimal Python example calling Aura-2-EN through AIgateway's OpenAI-compatible API. Install the packages with `pip install aigateway-py openai`, create an `sk-aig-...` key at aigateway.sh, then instantiate the OpenAI client with `base_url="https://api.aigateway.sh/v1"` and call `audio.speech.create` with `model="deepgram/aura-2-en"`. Streaming, tool calling, and async all work out of the box.
Install both packages with `pip install aigateway-py openai`. The OpenAI SDK handles the chat, embeddings, images, and audio endpoints, which covers Aura-2-EN end-to-end. `aigateway-py` adds gateway-specific primitives: sub-account key minting, evals, replays, async jobs, and webhook signature verification.
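The exact `aigateway-py` API for webhook signature verification isn't shown here, but the underlying pattern is standard: recompute an HMAC over the raw request body and compare in constant time. A minimal stdlib sketch, where the `sha256=<hex>` header format, the secret prefix, and the `verify_signature` helper are illustrative assumptions rather than the `aigateway-py` API:

```python
import hashlib
import hmac

def verify_signature(secret: str, payload: bytes, signature_header: str) -> bool:
    """Recompute the HMAC-SHA256 of the raw body and compare in constant time.

    The "sha256=<hex>" header format is an assumption for illustration;
    check the gateway's webhook docs for the real scheme.
    """
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    received = signature_header.removeprefix("sha256=")
    return hmac.compare_digest(expected, received)

# Example: a webhook body signed with a shared secret.
secret = "whsec_test"
body = b'{"event": "job.completed"}'
header = "sha256=" + hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
print(verify_signature(secret, body, header))                       # True
print(verify_signature(secret, b'{"event": "tampered"}', header))   # False
```

Always verify against the raw bytes of the request body, not a re-serialized JSON object, since serialization differences will change the digest.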
Use `client.audio.speech.create` with `model="deepgram/aura-2-en"`. AIgateway matches OpenAI's request and response shapes exactly, so autocompletion, type hints, and error handling all work.
Set `stream=True` for server-sent events. Pass `tools=[...]` for function calling. The OpenAI SDK's helpers (`with_raw_response`, `with_streaming_response`, runnable tools) all work against AIgateway unchanged because the wire format is identical.
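Function calling follows the standard OpenAI wire format: you send JSON-schema tool definitions as `tools=[...]`, the model replies with `tool_calls` carrying JSON-encoded arguments, and your code dispatches them. A sketch of the client-side plumbing, with a stubbed `get_weather` tool and a simulated tool call (both illustrative, not part of AIgateway) so it runs without a live connection:

```python
import json

# Tool definitions in the OpenAI function-calling schema,
# passed as tools=[...] to chat.completions.create.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stub implementation

REGISTRY = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> dict:
    """Execute one tool call and build the role="tool" follow-up message."""
    fn = REGISTRY[tool_call["function"]["name"]]
    args = json.loads(tool_call["function"]["arguments"])
    return {
        "role": "tool",
        "tool_call_id": tool_call["id"],
        "content": fn(**args),
    }

# Simulated call as it would appear in response.choices[0].message.tool_calls.
call = {"id": "call_1",
        "function": {"name": "get_weather", "arguments": '{"city": "Oslo"}'}}
print(dispatch(call)["content"])  # Sunny in Oslo
```

In a real loop you would append each dispatched message to the conversation and call `chat.completions.create` again so the model can use the tool result.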
```python
# pip install aigateway-py openai
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
)

audio = client.audio.speech.create(
    model="deepgram/aura-2-en",
    voice="alloy",
    input="Hello from AIgateway.",
)
audio.stream_to_file("out.mp3")
```

The same call in TypeScript with the official OpenAI Node SDK:

```typescript
import OpenAI from "openai";
import fs from "node:fs";

const client = new OpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_KEY,
});

const audio = await client.audio.speech.create({
  model: "deepgram/aura-2-en",
  voice: "alloy",
  input: "Hello from AIgateway.",
});

const buf = Buffer.from(await audio.arrayBuffer());
fs.writeFileSync("out.mp3", buf);
```

The official OpenAI Python SDK (the `openai` package) works unchanged. Pair it with `aigateway-py` for sub-account key minting, evals, replays, and webhook signature verification.
Pass `stream=True` to `chat.completions.create` and iterate: `for chunk in stream: print(chunk.choices[0].delta.content, end="")`. This works identically to OpenAI.
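One detail worth handling: `delta.content` is `None` on chunks that carry no text (such as the final chunk), so guard before concatenating. The accumulation pattern, sketched with stubbed chunk objects so it runs without a live connection:

```python
from dataclasses import dataclass
from typing import Optional

# Stand-ins for the SDK's streaming chunk objects, shaped like
# chunk.choices[0].delta.content, so the pattern runs offline.
@dataclass
class Delta:
    content: Optional[str]

@dataclass
class Choice:
    delta: Delta

@dataclass
class Chunk:
    choices: list

def collect(stream) -> str:
    """Concatenate delta fragments, skipping chunks with no text."""
    parts = []
    for chunk in stream:
        text = chunk.choices[0].delta.content
        if text is not None:
            parts.append(text)
    return "".join(parts)

fake_stream = [
    Chunk([Choice(Delta("Hel"))]),
    Chunk([Choice(Delta("lo"))]),
    Chunk([Choice(Delta(None))]),  # final chunk carries no content
]
print(collect(fake_stream))  # Hello
```

With the real SDK, `stream` is simply the return value of `chat.completions.create(..., stream=True)` and the loop body is identical.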
Yes. Import `AsyncOpenAI` from `openai` and `await client.chat.completions.create(...)`. AIgateway supports concurrent connections and HTTP/2.
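The usual payoff of the async client is fanning out many requests with `asyncio.gather`. A sketch of that pattern with a stubbed coroutine so it runs offline; in real code `fake_request` would be `await client.chat.completions.create(...)` on an `AsyncOpenAI` client:

```python
import asyncio

async def fake_request(prompt: str) -> str:
    """Stand-in for an awaited AIgateway call; the short sleep
    simulates network latency."""
    await asyncio.sleep(0.01)
    return f"answer to: {prompt}"

async def main() -> list:
    prompts = ["a", "b", "c"]
    # Fan the requests out concurrently; gather preserves input order.
    return await asyncio.gather(*(fake_request(p) for p in prompts))

results = asyncio.run(main())
print(results)  # ['answer to: a', 'answer to: b', 'answer to: c']
```

Because `gather` returns results in input order regardless of completion order, you can zip them back to the prompts directly.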
Yes. LangChain: `ChatOpenAI(base_url="https://api.aigateway.sh/v1", api_key="sk-aig-...", model="deepgram/aura-2-en")`. LlamaIndex: `OpenAI(api_base="https://api.aigateway.sh/v1", api_key="sk-aig-...", model="deepgram/aura-2-en")`.