Many enterprises already route through LiteLLM. Configure AIgateway as an OpenAI-compatible provider in LiteLLM's config and you keep your existing routing logic while gaining 100+ models behind one upstream.
Point a model entry at AIgateway's base URL and supply your key:
```yaml
model_list:
  - model_name: kimi-k2.6
    litellm_params:
      model: openai/moonshot/kimi-k2.6
      api_base: https://api.aigateway.sh/v1
      api_key: os.environ/AIGATEWAY_API_KEY
```