llmgateway and lm-proxy

These are competitors with overlapping core functionality: both provide unified API gateways that route requests across multiple LLM providers. lm-proxy emphasizes OpenAI compatibility and lightweight deployment, while llmgateway adds request analytics and management features.

| Metric | llmgateway | lm-proxy |
| --- | --- | --- |
| Overall score | 68 (Established) | 55 (Established) |
| Maintenance | 22/25 | 10/25 |
| Adoption | 10/25 | 9/25 |
| Maturity | 16/25 | 24/25 |
| Community | 20/25 | 12/25 |
| Stars | 948 | 92 |
| Forks | 108 | 10 |
| Downloads | n/a | n/a |
| Commits (30d) | 181 | 0 |
| Language | TypeScript | Python |
| License | n/a | MIT |

No package or dependents detected; no risk flags.

About llmgateway

theopenco/llmgateway

Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.

This helps developers who build applications using Large Language Models (LLMs) by acting as a central hub for all LLM interactions. It takes your application's LLM requests and routes them to various providers like OpenAI or Anthropic, while also managing API keys, tracking usage, and analyzing performance. Developers can use this to streamline their LLM infrastructure and gain insights into costs and model effectiveness.

Tags: LLM application development, API management, AI infrastructure, usage analytics, performance monitoring

About lm-proxy

Nayjest/lm-proxy

OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch). Lightweight, extensible Python/FastAPI—use as library or standalone service.

This tool helps developers and system architects manage their use of Large Language Models (LLMs) from various providers like OpenAI, Anthropic, or Google, as well as local models. It acts as a single access point, allowing you to send requests using the familiar OpenAI API format, and the proxy intelligently routes them to the correct LLM. You input your LLM requests and configuration, and it outputs responses from the chosen models, simplifying multi-provider setups.

Tags: LLM management, API integration, backend development, AI infrastructure, multi-model deployment

Scores updated daily from GitHub, PyPI, and npm data.