openai-forward and lm-proxy
These are competitors offering similar core functionality: both are API gateway/proxy services, written in Python, that forward requests to multiple LLM providers. They differ in focus (openai-forward emphasizes efficient forwarding, caching, and rate limiting, while lm-proxy emphasizes multi-provider routing behind an OpenAI-compatible API) and in maturity.
About openai-forward
KenyonY/openai-forward
🚀 An efficient forwarding service designed for LLMs · OpenAI API Reverse Proxy
This service helps developers and teams manage their interactions with large language models, whether hosted locally or in the cloud. It accepts requests destined for providers such as OpenAI or Google Gemini and forwards them efficiently, handling rate limiting, response caching, and API key management along the way. The result is faster, more controlled, and more cost-effective access to these models.
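Because the proxy relays the OpenAI wire format unchanged, client code only needs to point at the proxy's address instead of the provider's. The sketch below, using only the standard library, assumes a locally running openai-forward instance on port 8000 (the address and the placeholder key are illustrative, not part of the project's docs):

```python
import json
import urllib.request

# Hypothetical local openai-forward endpoint; the proxy relays this request
# to the configured upstream (e.g. api.openai.com) and caches/rate-limits it.
PROXY_BASE = "http://localhost:8000/v1"

payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Standard OpenAI-style chat completion request, aimed at the proxy.
req = urllib.request.Request(
    f"{PROXY_BASE}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-...",  # placeholder; forwarded upstream
    },
)
# urllib.request.urlopen(req) would send it once the proxy is running.
```

The same switch works with any OpenAI-compatible SDK by overriding its base URL, so existing application code stays untouched.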
About lm-proxy
Nayjest/lm-proxy
OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch). Lightweight, extensible Python/FastAPI—use as library or standalone service.
This tool helps developers and system architects manage Large Language Models (LLMs) from providers such as OpenAI, Anthropic, and Google, as well as local models. It acts as a single access point: you send requests in the familiar OpenAI API format, and the proxy routes each one to the configured backend model and returns its response, simplifying multi-provider setups.
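The practical effect of a single access point is that switching providers becomes a change of model name rather than a change of client code. A minimal standard-library sketch, assuming a hypothetical lm-proxy deployment at `localhost:8000` (the URL, model names, and gateway key are illustrative assumptions, not taken from the project's docs):

```python
import json
import urllib.request

# Hypothetical gateway address; one OpenAI-style URL fronts every provider.
GATEWAY = "http://localhost:8000/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-format request; the gateway routes by model name."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        GATEWAY,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer <gateway-key>",  # placeholder credential
        },
    )

# The same client code targets different providers just by switching `model`.
openai_req = build_request("gpt-4o-mini", "Hello")
claude_req = build_request("claude-3-5-haiku", "Hello")
```

Both requests hit the identical endpoint with the identical schema; which upstream provider answers is decided by the proxy's configuration, not by the caller.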