LiteLLM

Open-source gateway proxy for 100+ LLM providers with an OpenAI-compatible API: routing, budgets, fallbacks, and logging.

Inference / Hosting
Official site
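To make the gateway idea concrete, here is a minimal sketch of a LiteLLM proxy `config.yaml` exposing two providers behind one OpenAI-compatible endpoint. The model aliases, environment-variable names, and the exact shape of the fallback setting are illustrative assumptions, not verified against current docs:

```yaml
model_list:
  - model_name: gpt-4o              # alias that clients request
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY

litellm_settings:
  # Illustrative: route gpt-4o traffic to claude-sonnet on failure
  fallbacks:
    - gpt-4o: ["claude-sonnet"]
```

The proxy would then be started with `litellm --config config.yaml`, and clients talk to it as if it were the OpenAI API.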

Ideal for

Teams routing across multiple LLM vendors that need unified billing, auditing, and gradual rollouts; a common enterprise AI gateway pattern.

Less suitable if

You run a single-vendor setup, or your team wants zero self-hosted middleware.

When comparing

Vs OpenRouter / Portkey / Braintrust Proxy: LiteLLM is open source and flexible, with a broad ecosystem; OpenRouter is consumer-grade routing; Portkey leans into guardrails and caching.

Quick checklist

  • Design API key / multi-tenant isolation and auditing
  • Define rate limits and fallback policies
  • Monitor per-model cost/latency baselines
  • Size self-hosted deployment or LiteLLM Cloud capacity
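The cost/latency-baseline item above can be sketched in plain Python: aggregate token usage per model against a price table. The per-1K-token prices and model names below are placeholder assumptions, not real vendor pricing:

```python
# Placeholder per-1K-token prices in USD (illustrative, not real pricing).
PRICE_PER_1K = {
    "model-a": {"input": 0.005, "output": 0.015},
    "model-b": {"input": 0.001, "output": 0.002},
}

def request_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Cost of one request under the placeholder price table."""
    p = PRICE_PER_1K[model]
    return (prompt_tokens / 1000) * p["input"] + (completion_tokens / 1000) * p["output"]

def cost_by_model(records):
    """Aggregate (model, prompt_tokens, completion_tokens) records into per-model totals."""
    totals: dict[str, float] = {}
    for model, pt, ct in records:
        totals[model] = totals.get(model, 0.0) + request_cost(model, pt, ct)
    return totals

usage = [("model-a", 1200, 300), ("model-b", 5000, 1000), ("model-a", 800, 200)]
print(cost_by_model(usage))
```

In practice these numbers would come from the gateway's usage logs rather than a hand-built list, but the aggregation step is the same.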

Frequently asked questions (search)

LiteLLM vs OpenRouter: how to choose?

LiteLLM acts as an internal AI gateway in your VPC with policy and audit depth; OpenRouter is a consumer-friendly model marketplace. Many teams combine both, using OpenRouter as the catalog and LiteLLM as the governance layer.
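A sketch of that combination: LiteLLM can treat OpenRouter as just another upstream provider, so its catalog sits behind your own gateway's keys, budgets, and logs. The model slug and environment-variable name here are illustrative assumptions:

```yaml
model_list:
  - model_name: sonnet-via-openrouter   # alias your clients see
    litellm_params:
      model: openrouter/anthropic/claude-3.5-sonnet
      api_key: os.environ/OPENROUTER_API_KEY
```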

Use cases

The summary should help you decide whether the tool fits. If many similar tools exist, define usage frequency, budget, and privacy requirements before choosing.

Related tools