
LibreChat + AIgateway

YAML config → every AIgateway model available as a LibreChat endpoint.

LibreChat (20k+ stars) is a self-hosted multi-provider chat UI. Its librechat.yaml supports custom OpenAI-compatible endpoints — add AIgateway once and every model in the catalog is selectable from the model picker.

Setup

Three steps.

STEP 01

Add a custom endpoint to librechat.yaml

LibreChat loads librechat.yaml on boot. Add AIgateway as a custom endpoint with your key and base URL. Set fetch: true and LibreChat will pull the live model list from /v1/models.

# librechat.yaml
version: 1.0.5
endpoints:
  custom:
    - name: "AIgateway"
      apiKey: "${AIGATEWAY_API_KEY}"
      baseURL: "https://api.aigateway.sh/v1"
      models:
        default: ["anthropic/claude-opus-4.7", "openai/gpt-5.4", "moonshot/kimi-k2.6"]
        fetch: true
      titleConvo: true
      titleModel: "openai/gpt-5.4-mini"
      summarize: false
      modelDisplayLabel: "AIgateway"
      iconURL: "https://aigateway.sh/brand/mark.svg"
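The `${AIGATEWAY_API_KEY}` reference is resolved from the environment when LibreChat loads the file, using shell-style variable expansion. A minimal sketch of that substitution, using Python's stdlib and a placeholder key value:

```python
import os

# Simulate LibreChat resolving "${AIGATEWAY_API_KEY}" from the environment.
# The variable name matches the librechat.yaml above; the value is a placeholder.
os.environ["AIGATEWAY_API_KEY"] = "sk-aig-example"

api_key = os.path.expandvars("${AIGATEWAY_API_KEY}")
print(api_key)  # sk-aig-example
```

If the variable is unset, the literal `${AIGATEWAY_API_KEY}` string is left in place, which is the usual symptom of a missing `.env` entry.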
STEP 02

Export the key + restart

Put AIGATEWAY_API_KEY in your .env (or whatever secret store LibreChat reads), then restart the container.

echo 'AIGATEWAY_API_KEY=sk-aig-...' >> .env
docker compose restart api
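The `.env` entry must be plain `KEY=value` with no `export` prefix and no quotes. A quick sketch of how a dotenv-style loader reads that line, with a hypothetical temp file standing in for your real `.env`:

```python
import os
import tempfile

# Write a one-line .env with a placeholder key, then parse it the way a
# dotenv loader would: skip blanks and comments, split on the first "=".
with tempfile.NamedTemporaryFile("w+", suffix=".env", delete=False) as f:
    f.write("AIGATEWAY_API_KEY=sk-aig-example\n")
    path = f.name

env = {}
with open(path) as f:
    for line in f:
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key] = value

print(env["AIGATEWAY_API_KEY"])  # sk-aig-example
os.unlink(path)
```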
STEP 03

Pick AIgateway from the model dropdown

AIgateway will appear as a new endpoint. Every model from /v1/models shows up, and streaming and tool-calling work unchanged.
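Under the hood, LibreChat sends a standard OpenAI-compatible /v1/chat/completions request through the gateway. A sketch of that payload shape, with a hypothetical `get_weather` tool to illustrate that streaming and tool definitions pass through as-is:

```python
import json

# Illustrative request body; the model ID comes from the config above,
# the tool is a made-up example. The wire format is the standard
# OpenAI chat-completions schema, unchanged by the gateway.
payload = {
    "model": "anthropic/claude-opus-4.7",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "stream": True,
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

print(json.dumps(payload, indent=2))
```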

Notes
  • fetch: true lets LibreChat auto-discover new models as you add them to AIgateway, so no YAML update is needed per model.
  • titleConvo uses a cheap model to auto-title threads. Point it at openai/gpt-5.4-mini or moonshot/kimi-k2.6 to keep costs down.
  • Works against LibreChat's Docker, compose, and bare-metal deployments. The YAML path is identical.
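With fetch: true, model discovery boils down to reading the IDs out of the /v1/models list response and merging them into the picker. A sketch against a hypothetical response body:

```python
import json

# Hypothetical /v1/models response from the gateway; with fetch: true,
# LibreChat merges these IDs into the picker alongside the `default` list.
body = json.dumps({"object": "list", "data": [
    {"id": "anthropic/claude-opus-4.7"},
    {"id": "openai/gpt-5.4"},
    {"id": "moonshot/kimi-k2.6"},
]})

model_ids = [m["id"] for m in json.loads(body)["data"]]
print(model_ids)
```

Any model you enable on the gateway side appears in this list on the next fetch, which is why the yaml never needs a per-model entry.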
More integrations

Same key. Every other tool.