
LobeChat + AIgateway

Self-hosted ChatGPT alternative: set two environment variables and every AIgateway model shows up.

LobeChat is a popular open-source chat UI (55k+ stars) that runs anywhere Docker does. Because it consumes any OpenAI-compatible endpoint, AIgateway plugs in with two environment variables — no plugins, no forks.

Setup

Pick whichever route fits your deployment: Docker flags, the in-app settings, or compose/Helm.

STEP 01

Docker — set the proxy URL and key

Pass OPENAI_PROXY_URL and OPENAI_API_KEY when you start the container. LobeChat routes every OpenAI-provider call through AIgateway.

docker run -d --name lobe-chat -p 3210:3210 \
  -e OPENAI_PROXY_URL="https://api.aigateway.sh/v1" \
  -e OPENAI_API_KEY="sk-aig-..." \
  -e OPENAI_MODEL_LIST="+anthropic/claude-opus-4.7,+moonshot/kimi-k2.6,+openai/gpt-5.4" \
  lobehub/lobe-chat
STEP 02

Or configure in the UI

Settings → AI Service Provider → OpenAI. Flip on Use Custom API Endpoint, paste https://api.aigateway.sh/v1 and your AIgateway key, then add models using their provider/slug form (e.g. moonshot/kimi-k2.6). Changes persist per-browser.

STEP 03

Kubernetes / docker-compose

Same two env vars via your compose file or Helm values. No sidecar, no webhook rewrite.

# docker-compose.yml
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports: ["3210:3210"]
    environment:
      OPENAI_PROXY_URL: https://api.aigateway.sh/v1
      OPENAI_API_KEY: ${AIGATEWAY_API_KEY}
      OPENAI_MODEL_LIST: "+anthropic/claude-opus-4.7,+moonshot/kimi-k2.6,+openai/gpt-5.4"
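On Kubernetes, the same two variables go on the container spec. A minimal Deployment sketch, assuming you store the AIgateway key in a Secret named aigateway (names and chart layout are illustrative, not prescribed):

```yaml
# deployment.yaml (sketch): same env vars, key pulled from a Secret
apiVersion: apps/v1
kind: Deployment
metadata:
  name: lobe-chat
spec:
  replicas: 1
  selector:
    matchLabels: { app: lobe-chat }
  template:
    metadata:
      labels: { app: lobe-chat }
    spec:
      containers:
        - name: lobe-chat
          image: lobehub/lobe-chat
          ports:
            - containerPort: 3210
          env:
            - name: OPENAI_PROXY_URL
              value: https://api.aigateway.sh/v1
            - name: OPENAI_API_KEY
              valueFrom:
                secretKeyRef: { name: aigateway, key: api-key }
```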
Notes
  • LobeChat's model ID format accepts any string, so provider/slug IDs like moonshot/kimi-k2.6 pass straight through.
  • Vision, function calling, and JSON mode work because AIgateway forwards the full OpenAI schema.
  • For multi-tenant LobeChat deployments, mint scoped sub-account keys from AIgateway so each user gets per-key spend caps.
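The OPENAI_MODEL_LIST syntax used above is mechanical (a leading + adds each provider/slug ID to the picker), so you can generate the value from a plain list in shell. A sketch using the model IDs from this guide:

```shell
# Build an OPENAI_MODEL_LIST value from space-separated provider/slug IDs.
# The leading "+" tells LobeChat to add each model to the picker.
models="anthropic/claude-opus-4.7 moonshot/kimi-k2.6 openai/gpt-5.4"
list=$(printf '+%s,' $models)       # emits "+id," for each entry
echo "OPENAI_MODEL_LIST=${list%,}"  # strip the trailing comma
# → OPENAI_MODEL_LIST=+anthropic/claude-opus-4.7,+moonshot/kimi-k2.6,+openai/gpt-5.4
```

Handy when the model list lives in CI or a values file rather than being typed by hand.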
More integrations

Same key. Every other tool.