litellm and llmgateway

These are **competitors**: both provide unified API gateways that abstract away provider-specific implementations and route requests across multiple LLM providers. litellm is significantly more mature and feature-rich (cost tracking, guardrails, load balancing), while llmgateway offers a simpler alternative for basic multi-provider routing and analytics.

|                | litellm        | llmgateway                |
|----------------|----------------|---------------------------|
| Score          | 85 (Verified)  | 68 (Established)          |
| Maintenance    | 22/25          | 22/25                     |
| Adoption       | 15/25          | 10/25                     |
| Maturity       | 25/25          | 16/25                     |
| Community      | 23/25          | 20/25                     |
| Stars          | 38,910         | 948                       |
| Forks          | 6,381          | 108                       |
| Commits (30d)  | 1,497          | 181                       |
| Language       | Python         | TypeScript                |
| Risk flags     | None           | No package, no dependents |

About litellm

BerriAI/litellm

Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, loadbalancing and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]

This project helps developers integrate over 100 large language models (LLMs) and AI agents into their applications without worrying about API differences. It takes requests in a standardized format, routes them to various LLM providers, and handles responses, enabling easier management and deployment of AI-powered features. Developers and AI engineers building diverse applications powered by multiple LLMs are the primary users.

AI-application-development LLM-integration AI-gateway API-management developer-tooling
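The core idea litellm implements is an OpenAI-format `completion()` call that is routed to whichever provider hosts the requested model. The following is only a conceptual sketch of that routing pattern, not litellm's actual implementation; the provider functions, model prefixes, and response shape here are made up for illustration:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Response:
    model: str
    content: str
    provider: str

# Hypothetical backends; a real gateway would call each vendor's SDK here.
def _call_openai(model: str, messages: list[dict]) -> str:
    return f"openai reply to: {messages[-1]['content']}"

def _call_anthropic(model: str, messages: list[dict]) -> str:
    return f"anthropic reply to: {messages[-1]['content']}"

# Map a model-name prefix to the backend that serves it.
PROVIDERS: dict[str, Callable[[str, list[dict]], str]] = {
    "gpt": _call_openai,        # e.g. gpt-4o
    "claude": _call_anthropic,  # e.g. claude-sonnet models
}

def completion(model: str, messages: list[dict]) -> Response:
    """Route an OpenAI-format request to the matching provider."""
    for prefix, backend in PROVIDERS.items():
        if model.startswith(prefix):
            return Response(model, backend(model, messages), prefix)
    raise ValueError(f"no provider registered for model {model!r}")
```

The point of the abstraction is that the caller's code is identical regardless of backend, e.g. `completion("gpt-4", [{"role": "user", "content": "hello"}])` and `completion("claude-3", ...)` differ only in the model string.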

About llmgateway

theopenco/llmgateway

Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.

This helps developers who build applications using Large Language Models (LLMs) by acting as a central hub for all LLM interactions. It takes your application's LLM requests and routes them to various providers like OpenAI or Anthropic, while also managing API keys, tracking usage, and analyzing performance. Developers can use this to streamline their LLM infrastructure and gain insights into costs and model effectiveness.

LLM-application-development API-management AI-infrastructure usage-analytics performance-monitoring
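llmgateway's "central hub" role combines three concerns: routing, key management, and usage/cost accounting. A minimal sketch of how those fit together, assuming made-up per-token prices and a crude whitespace token count (this is illustrative only, not llmgateway's API):

```python
from collections import defaultdict

class Gateway:
    """Sketch of a gateway that routes requests, holds provider API keys
    centrally, and accumulates per-provider usage statistics."""

    # Hypothetical prices per 1K tokens, for illustration only.
    COST_PER_1K_TOKENS = {"openai": 0.002, "anthropic": 0.003}

    def __init__(self, api_keys: dict[str, str]):
        self.api_keys = api_keys  # provider -> key, managed in one place
        self.usage = defaultdict(
            lambda: {"requests": 0, "tokens": 0, "cost": 0.0}
        )

    def route(self, provider: str, prompt: str) -> str:
        if provider not in self.api_keys:
            raise KeyError(f"no API key configured for {provider!r}")
        tokens = len(prompt.split())  # crude stand-in for real tokenization
        stats = self.usage[provider]
        stats["requests"] += 1
        stats["tokens"] += tokens
        stats["cost"] += tokens / 1000 * self.COST_PER_1K_TOKENS[provider]
        # A real gateway would forward the request to the provider here.
        return f"[{provider}] response to: {prompt}"
```

Because every request passes through one object, the application gets usage and cost breakdowns per provider for free, e.g. `gw.usage["openai"]` after a few calls.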

Scores updated daily from GitHub, PyPI, and npm data.