
LiteLLM + AIgateway

Use LiteLLM as a proxy upstream of AIgateway.

Many enterprises already route through LiteLLM. Configure AIgateway as an OpenAI-compatible provider in LiteLLM's config and you keep your existing routing logic while gaining 100+ models behind one upstream.

LiteLLM homepage →
Setup

Three steps or fewer.

STEP 01

Add to LiteLLM config

Point a model entry at AIgateway with your key.

model_list:
  - model_name: kimi-k2.6
    litellm_params:
      model: openai/moonshot/kimi-k2.6
      api_base: https://api.aigateway.sh/v1
      api_key: os.environ/AIGATEWAY_API_KEY
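Once the proxy is running with this config (for example via `litellm --config config.yaml`), any OpenAI-compatible client can reach the routed model. The sketch below shows the request body such a client sends; the `localhost:4000` address and the `/v1/chat/completions` path are LiteLLM proxy defaults and are assumptions here, not AIgateway specifics.

```python
import json

# Sketch (assumptions): the LiteLLM proxy is running with the config above
# and listening on localhost:4000, LiteLLM's default port. The proxy exposes
# an OpenAI-compatible chat completions endpoint, and the request selects the
# route by the config's model_name.
payload = {
    "model": "kimi-k2.6",  # must match model_name in model_list
    "messages": [{"role": "user", "content": "Hello through the proxy"}],
}

# POST this JSON to http://localhost:4000/v1/chat/completions, sending your
# LiteLLM proxy key in the Authorization header.
print(json.dumps(payload))
```

LiteLLM forwards the call to `api_base` with the `openai/moonshot/kimi-k2.6` upstream model name, so clients never see the AIgateway key directly.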

More integrations

Same key. Every other tool.