one-api and fastapi-web
These are competing projects with overlapping LLM API aggregation and management functionality. Both provide a unified API adapter across multiple LLM providers with Docker deployment, so users would typically select one based on feature completeness and ecosystem maturity rather than using them together.
About one-api
songquanpeng/one-api
LLM API management & distribution system supporting mainstream models including OpenAI, Azure, Anthropic Claude, Google Gemini, DeepSeek, ByteDance Doubao, ChatGLM, ERNIE Bot, iFlytek Spark, Tongyi Qianwen, 360 Zhinao, and Tencent Hunyuan, with unified API adaptation; usable for key management and redistribution. Ships as a single executable with a Docker image for one-click deployment, ready to use out of the box, with an English UI.
This system helps organizations centralize access to various large language models (LLMs) like OpenAI, Google Gemini, and Anthropic Claude. It takes your existing API keys from these different providers and unifies them under a single, standard API endpoint. This is ideal for businesses, developers, or teams who need to manage and distribute access to multiple LLM services efficiently.
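To illustrate the "single, standard API endpoint" idea, here is a minimal sketch of calling an OpenAI-compatible gateway such as one-api from Python. The base URL, token, and model name are placeholder assumptions, not values from either project's documentation; the request shape follows the standard OpenAI chat completions format that these gateways adapt providers to.

```python
# Sketch: one client-side request shape, regardless of which upstream
# LLM provider the gateway routes to. Base URL and token are placeholders.
import json
import urllib.request


def build_chat_request(base_url: str, token: str, model: str, prompt: str):
    """Build an OpenAI-style chat completion request aimed at the gateway."""
    payload = {
        "model": model,  # the gateway maps this to the matching upstream provider
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            # A key issued by the gateway, not by the underlying provider
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )


# The same code works whether the gateway forwards to OpenAI, Claude,
# or Gemini; only the model name changes.
req = build_chat_request("http://localhost:3000", "sk-example", "gpt-4o", "Hello")
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would require a running gateway; the point is that the client integrates once against this shape and the gateway handles provider-specific differences.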
About fastapi-web
iimeta/fastapi-web
Enterprise-grade LLM API rapid-integration system supporting OpenAI, Azure, ERNIE Bot, iFlytek Spark, Tongyi Qianwen, Zhipu GLM, Gemini, DeepSeek, Anthropic Claude, and any model exposing an OpenAI-format API. Clean interface style, lightweight, efficient, and stable, with one-click Docker deployment.
This system simplifies integrating multiple large language models (LLMs) like OpenAI, Azure, and Gemini into your business applications. It provides a unified API standard, allowing your systems to connect to various LLMs with a single integration, abstracting away the complexities of each individual model. Developers building business systems that leverage AI capabilities would use this to streamline their LLM integration efforts.