litellm and openai-forward
These are **competitors** — both provide API gateway/proxy functionality to standardize access to multiple LLM providers, though LiteLLM offers significantly more features (cost tracking, guardrails, load balancing) while openai-forward focuses on lightweight reverse proxying.
About litellm
BerriAI/litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, loadbalancing and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
This project helps developers integrate over 100 large language models (LLMs) and AI agents into their applications without worrying about API differences. It takes requests in a standardized format, routes them to various LLM providers, and handles responses, enabling easier management and deployment of AI-powered features. Developers and AI engineers building diverse applications powered by multiple LLMs are the primary users.
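The core idea — one standardized, OpenAI-style request shape dispatched to different providers — can be sketched in plain Python. This is a hypothetical illustration of the gateway pattern, not LiteLLM's actual routing code; the `PROVIDERS` table and `route` function are invented for this example.

```python
# Hypothetical sketch of the gateway idea: one OpenAI-style request
# shape, dispatched to a provider based on the model name prefix.
# Illustrative only; not litellm's actual internals.

PROVIDERS = {
    "gpt": "openai",
    "claude": "anthropic",
    "gemini": "vertex_ai",
}

def route(request: dict) -> str:
    """Pick a provider from the model name; fall back to openai."""
    model = request["model"]
    for prefix, provider in PROVIDERS.items():
        if model.startswith(prefix):
            return provider
    return "openai"

# The same request format works regardless of which backend serves it.
request = {
    "model": "claude-3-haiku",
    "messages": [{"role": "user", "content": "Hello"}],
}
print(route(request))  # anthropic
```

Because the request shape never changes, application code can swap providers by changing only the model string — which is the portability LiteLLM's unified format provides.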
About openai-forward
KenyonY/openai-forward
🚀 An efficient forwarding service designed for LLMs · OpenAI API Reverse Proxy

This service helps developers and teams manage their interactions with large language models, whether hosted locally or in the cloud. It takes requests for AI models such as OpenAI or Google Gemini and efficiently forwards them, handling tasks like rate limiting, response caching, and API key management. The result is a faster, more controlled, and more cost-effective way to use these models.
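Two of the tasks described above — caching identical requests and rotating across a pool of upstream API keys — can be sketched with the standard library alone. This is a hypothetical illustration of the forwarding-proxy pattern, not openai-forward's actual implementation; the `ForwardingProxy` class and `fake_backend` are invented for this example.

```python
# Hypothetical sketch of two things a forwarding proxy can do in front
# of an LLM API: cache identical requests and round-robin across a
# pool of upstream API keys. Not openai-forward's actual code.
import hashlib
import itertools
import json

class ForwardingProxy:
    def __init__(self, api_keys, backend):
        self._keys = itertools.cycle(api_keys)  # round-robin key rotation
        self._cache = {}                        # request digest -> response
        self._backend = backend                 # callable(key, request)

    def handle(self, request: dict):
        # Cache key: a stable hash of the canonicalized request body.
        body = json.dumps(request, sort_keys=True).encode()
        digest = hashlib.sha256(body).hexdigest()
        if digest in self._cache:
            return self._cache[digest]          # cache hit: no upstream call
        response = self._backend(next(self._keys), request)
        self._cache[digest] = response
        return response

# Usage with a fake backend that records which key served each call.
calls = []
def fake_backend(key, request):
    calls.append(key)
    return {"echo": request["messages"][0]["content"], "key": key}

proxy = ForwardingProxy(["sk-a", "sk-b"], fake_backend)
req = {"model": "gpt-4o", "messages": [{"role": "user", "content": "hi"}]}
r1 = proxy.handle(req)
r2 = proxy.handle(req)   # identical request: served from the cache
print(len(calls))  # 1
```

Hashing the canonicalized JSON body makes the cache key deterministic, so repeated identical prompts never reach the upstream API — the cost-saving behavior the description refers to.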