Aura-2-EN Python example?

Here's a minimal Python example calling Aura-2-EN through AIgateway's OpenAI-compatible API. Install `pip install aigateway-py openai`, create an sk-aig-... key at aigateway.sh, then instantiate the OpenAI client with base_url="https://api.aigateway.sh/v1" and call audio.speech.create with model="deepgram/aura-2-en". Streaming and async work out of the box; tool calling applies to the chat models you route through the same gateway.

How it works

1. Install the SDKs

pip install aigateway-py openai. The OpenAI SDK covers the chat, embeddings, images, and audio endpoints, which is everything Aura-2-EN needs end-to-end. aigateway-py adds gateway-specific primitives: sub-account key minting, evals, replays, async jobs, and webhook signature verification.

2. Construct the client once, call the right endpoint

Use client.audio.speech.create with model="deepgram/aura-2-en". AIgateway matches OpenAI's request and response shapes exactly, so autocompletion, type hints, and error handling all work.

3. Stream, batch, or tool-call with the same API

For speech, wrap the call in client.with_streaming_response to write audio to disk as it arrives. For chat models routed through the same gateway, set stream=True for server-sent events and pass tools=[...] for function calling. The OpenAI SDK's helpers (with_raw_response, with_streaming_response, runnable tools) all work against AIgateway unchanged because the wire format is identical.

Code example

Python
# pip install aigateway-py openai
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aigateway.sh/v1",
    api_key="sk-aig-...",
)

audio = client.audio.speech.create(
    model="deepgram/aura-2-en",
    voice="alloy",
    input="Hello from AIgateway.",
)
audio.write_to_file("out.mp3")  # stream_to_file() is deprecated in recent SDK releases
Node / TypeScript
import OpenAI from "openai";
import fs from "node:fs";

const client = new OpenAI({
  baseURL: "https://api.aigateway.sh/v1",
  apiKey: process.env.AIGATEWAY_KEY,
});

const audio = await client.audio.speech.create({
  model: "deepgram/aura-2-en",
  voice: "alloy",
  input: "Hello from AIgateway.",
});
const buf = Buffer.from(await audio.arrayBuffer());
fs.writeFileSync("out.mp3", buf);

FAQ

What Python SDK should I use with Aura-2-EN?

The official OpenAI Python SDK (openai package) works unchanged. Pair with aigateway-py for sub-account key minting, evals, replays, and webhook signature verification.

How do I stream Aura-2-EN responses in Python?

Aura-2-EN returns audio rather than token deltas, so use the SDK's streaming-response helper: with client.with_streaming_response.audio.speech.create(...) as response: response.stream_to_file("out.mp3"). Chunks are written as they arrive. (Passing stream=True to chat.completions.create is the pattern for chat models routed through the gateway.)

Can I use async Python with Aura-2-EN?

Yes. Import AsyncOpenAI from openai and await client.audio.speech.create(...). AIgateway supports concurrent connections and HTTP/2.

Does Aura-2-EN work with LangChain and LlamaIndex?

The gateway works with both: point each framework's OpenAI integration at base_url="https://api.aigateway.sh/v1" (api_base in LlamaIndex) with your sk-aig-... key. Note that ChatOpenAI and LlamaIndex's OpenAI wrapper drive chat models; Aura-2-EN is a speech endpoint, so call it directly via the OpenAI SDK's audio.speech.create and use the framework wrappers for the chat models behind the same key.

TRY IT NOW

One key, every model. Free tier, no card.

Get an AIgateway key · Open the playground