What is litellm?
litellm is an infrastructure tool: a Python SDK and Proxy Server (AI Gateway) for calling 100+ LLM APIs in the OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. Supported providers include Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, and NVIDIA NIM. It has a Nerq Trust Score of 78/100 (B) and 36.4K GitHub stars. Published by BerriAI. Last analyzed March 2026.
Why This Score
- ⚠️ Security: 0/100 — Significant security concerns
- ⚠️ Maintenance: 1/100 — Maintenance activity is low
- ✅ Community: 36.4K stars, 0 recorded downloads — Large community
- ⚠️ Transparency: License: NOASSERTION — License could not be determined
Trust & Safety Overview
What litellm Does
litellm is an mcp_server in the infrastructure category. It provides a Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in the OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging; supported providers include Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, and NVIDIA NIM. It is published by BerriAI and is open source. With 36.4K GitHub stars, it has a large and active community of users and contributors.
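A minimal sketch of the unified call shape described above. This assumes litellm is installed and provider API keys are set in the environment; the `build_request` helper is illustrative, not part of litellm itself.

```python
# Sketch of litellm's unified OpenAI-format interface: the provider is
# selected by the model string, while the payload shape stays the same.
# `build_request` is an illustrative helper, not part of litellm.

def build_request(model: str, prompt: str) -> dict:
    """Build the OpenAI-format payload accepted by litellm's completion()."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# The same shape works across providers:
openai_req = build_request("gpt-4o", "Hello")
anthropic_req = build_request("anthropic/claude-3-sonnet", "Hello")

# With litellm installed and keys configured, either payload could be sent:
#   from litellm import completion
#   response = completion(**openai_req)
#   print(response.choices[0].message.content)
```

The point of the uniform payload is that switching providers is a one-string change to `model`, not a rewrite of the calling code.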
Who Should Use litellm
litellm's overall trust score (78/100) and large community make it a reasonable candidate for production use, but the low security (0/100) and maintenance (1/100) subscores warrant a careful review before adoption.
Details
| Author | BerriAI |
|---|---|
| Category | infrastructure |
| License | NOASSERTION |
| Type | mcp_server |
| Source | View on GitHub |
| Security Score | 0/100 |
| Activity Score | 1/100 |
How to Get Started
Check the trust score before installing:
curl "nerq.ai/v1/preflight?target=berriai-litellm"
Safer Alternatives
| Tool | Trust | Stars |
|---|---|---|
| n8n | 78 | 177.3K |
| langflow | 88 | 145.4K |
| dify | 79 | 130.8K |
| open-webui | 75 | 124.5K |
| gemini-cli | 72 | 98.5K |
Last updated March 2026. Trust scores based on automated analysis of public data.