Open WebUI is the most popular self-hosted LLM front-end (100k+ GitHub stars), commonly paired with Ollama. Because it also speaks the OpenAI protocol, AIgateway can be registered as a single OpenAI API connection, and every model behind the gateway appears in Open WebUI's model picker.
Set OPENAI_API_BASE_URL to https://api.aigateway.sh/v1 and OPENAI_API_KEY to your AIgateway key when starting Open WebUI. It will then treat AIgateway as the connection for everything under the OpenAI tab.
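For a fresh deployment, the two variables can be passed at container launch. A minimal sketch assuming the standard Open WebUI Docker image and default ports; the key value is a placeholder:

```shell
# Launch Open WebUI pointed at AIgateway as its OpenAI-compatible back-end.
# Replace your-aigateway-key with your actual AIgateway key.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=https://api.aigateway.sh/v1 \
  -e OPENAI_API_KEY=your-aigateway-key \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

The named volume keeps users, chats, and settings across container upgrades.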
If you've already deployed Open WebUI, sign in as an admin and go to Settings → Connections → OpenAI API. URL: https://api.aigateway.sh/v1 · Key: your AIgateway key. Save, and the model list populates from /v1/models.
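If the picker stays empty, you can query the same endpoint directly from a shell. A quick sanity check, assuming your key is exported as AIGATEWAY_API_KEY (a placeholder name):

```shell
# Hit the /v1/models endpoint Open WebUI uses to populate its model picker.
# A JSON list of model IDs means the key and base URL are correct.
curl -s https://api.aigateway.sh/v1/models \
  -H "Authorization: Bearer $AIGATEWAY_API_KEY"
```

An authentication error here points at the key; a connection error points at the URL or network.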
Open WebUI can serve local Ollama models and AIgateway models side by side: local inference for privacy-sensitive prompts, AIgateway for frontier models. Users pick per conversation from the model picker.
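Both back-ends can be wired into one container. A sketch assuming Ollama is running on the Docker host at its default port (11434); Open WebUI merges both model lists into a single picker:

```shell
# Run Open WebUI with a local Ollama back-end AND AIgateway at the same time.
# --add-host maps host.docker.internal to the host so the container can reach Ollama.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -e OPENAI_API_BASE_URL=https://api.aigateway.sh/v1 \
  -e OPENAI_API_KEY=your-aigateway-key \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Local models show up from the Ollama connection and gateway models from the OpenAI connection, with no per-model configuration needed.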