llmgateway and LLM-API-Key-Proxy
These are competing projects with overlapping core functionality: both provide unified API gateways with multi-provider routing and request management. llmgateway emphasizes usage analytics, while LLM-API-Key-Proxy emphasizes load balancing and protocol compatibility.
About llmgateway
theopenco/llmgateway
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
llmgateway acts as a central hub for applications built on Large Language Models (LLMs). It takes your application's LLM requests and routes them to providers such as OpenAI or Anthropic, while also managing API keys, tracking usage, and analyzing performance. Developers can use it to streamline their LLM infrastructure and gain insight into costs and model effectiveness.
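The routing idea described above can be sketched in a few lines: the gateway maps each model name to the provider that serves it, so callers hit a single endpoint regardless of vendor. The prefixes and function names below are illustrative assumptions, not llmgateway's actual implementation.

```python
# Hypothetical sketch of model-name-based provider routing.
# The prefix table and route_request() are illustrative only.

PROVIDER_ROUTES = {
    "gpt-": "https://api.openai.com/v1",        # OpenAI models
    "claude-": "https://api.anthropic.com/v1",  # Anthropic models
}

def route_request(model: str) -> str:
    """Return the upstream base URL for a given model name."""
    for prefix, base_url in PROVIDER_ROUTES.items():
        if model.startswith(prefix):
            return base_url
    raise ValueError(f"No provider configured for model: {model}")
```

A real gateway would layer API-key management and usage tracking on top of this lookup, but the core dispatch is just a model-to-provider mapping.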
About LLM-API-Key-Proxy
Mirrowel/LLM-API-Key-Proxy
Universal LLM Gateway: One API, every LLM. OpenAI/Anthropic-compatible endpoints with multi-provider translation and intelligent load-balancing.
This tool helps individuals and small teams who work with multiple Large Language Models (LLMs) and need a simpler, more reliable way to manage them. You register your various LLM API keys and model choices, and it exposes a single access point that works with almost any existing LLM application. It is well suited to developers, researchers, and anyone building LLM-powered applications.
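The "intelligent load-balancing" across multiple keys can be illustrated with a minimal round-robin rotator: each outgoing request uses the next key in sequence, so no single key absorbs all the traffic or rate limits. This is a sketch of the general concept, not the proxy's actual code; the class and method names are made up for illustration.

```python
from itertools import cycle

class KeyRotator:
    """Hypothetical round-robin rotation over several API keys
    for the same provider (illustrative, not the proxy's code)."""

    def __init__(self, keys):
        if not keys:
            raise ValueError("at least one API key is required")
        self._keys = cycle(keys)  # endless round-robin iterator

    def next_key(self):
        """Return the next key in round-robin order."""
        return next(self._keys)
```

Usage: `KeyRotator(["key-a", "key-b"])` yields `key-a`, then `key-b`, then wraps back to `key-a`. A production proxy would also track per-key failures and rate-limit responses before choosing the next key.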