LiteLLM

tool · active · open-source

A unified interface for calling 100+ LLM providers using the OpenAI API format. LiteLLM acts as a translation layer—write code once using the OpenAI format and switch between providers by changing the model name.
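A minimal sketch of that idea, assuming LiteLLM's documented `completion()` entry point: the request uses OpenAI-style messages, and only the model string selects the provider. The model names below are illustrative, and real calls need provider API keys.

```python
# Hedged sketch: one call shape, many providers. litellm.completion()
# mirrors OpenAI's chat-completions signature.
messages = [{"role": "user", "content": "What is LiteLLM?"}]

def ask(model: str):
    import litellm  # imported lazily; requires `pip install litellm`
    # LiteLLM translates this OpenAI-format request into the
    # target provider's native API under the hood.
    return litellm.completion(model=model, messages=messages)

# The same function reaches different providers (not executed here):
# ask("gpt-4o")                            # OpenAI
# ask("anthropic/claude-3-opus-20240229")  # Anthropic
# ask("ollama/llama3")                     # a local model via Ollama
```

Switching providers is a one-line change to the model string; the surrounding application code stays untouched.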

Implements

Concepts this tool claims to implement:

  • Proxy server that routes requests to different LLM providers, with fallback, load balancing, and spend tracking.

  • OpenAI API (primary)

    Core value prop: call any LLM using OpenAI's API format. Handles translation to each provider's native format.

  • Model Router (secondary)

    Route requests to different models based on config. Fallback chains when primary model fails.
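
A fallback chain of this kind can be sketched in plain Python. The dict shapes below mirror LiteLLM's documented `model_list`/`fallbacks` config format; `resolve_chain` is a hypothetical helper added here purely to illustrate the try-order, not part of LiteLLM's API.

```python
# Hedged sketch of a router config with a fallback chain. An alias
# ("chat") maps to a primary deployment, and a fallback rule names
# the backup alias to retry when the primary fails.
model_list = [
    {"model_name": "chat",
     "litellm_params": {"model": "gpt-4o"}},
    {"model_name": "chat-backup",
     "litellm_params": {"model": "anthropic/claude-3-sonnet"}},
]
fallbacks = [{"chat": ["chat-backup"]}]

def resolve_chain(alias: str) -> list:
    """Illustrative only: the try-order for an alias, primary first."""
    chain = [alias]
    for rule in fallbacks:
        chain += rule.get(alias, [])
    return chain

print(resolve_chain("chat"))  # ['chat', 'chat-backup']
```

In LiteLLM itself this configuration would be handed to the Router (or the proxy's config file), which performs the actual retry against the backup deployment.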

Integration Surfaces

  • Python SDK
  • Proxy server
  • OpenAI-compatible API
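
Because the proxy speaks the OpenAI wire format, any OpenAI client can target it by swapping the base URL. A hedged sketch, where the URL, key, and model alias are placeholders:

```python
# Hedged sketch: an OpenAI-format chat request body, as it would be
# sent to a LiteLLM proxy. Only the payload is built here; no call
# is made.
import json

payload = {
    "model": "chat",  # a model_name alias defined in the proxy config
    "messages": [{"role": "user", "content": "Hello"}],
}
body = json.dumps(payload)

# With the official OpenAI SDK (not executed here):
# from openai import OpenAI
# client = OpenAI(base_url="http://localhost:4000", api_key="sk-proxy-key")
# client.chat.completions.create(**payload)
print(body)
```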

Details

Vendor
BerriAI
License
MIT
Runs On
local, cloud
Used By
system

Notes

LiteLLM is invaluable for multi-provider setups. It supports 100+ providers, including OpenAI, Anthropic, Cohere, and local models. The proxy mode adds enterprise features like spend tracking and fallback routing.