dext7r/ollama-api-pool

🚀 Intelligent Ollama API proxy pool built on Cloudflare Workers, with support for multi-account rotation, automatic failover, load balancing, and unified authentication

Quality score: 49 / 100 (Emerging)

This tool helps developers who are building applications that use large language models (LLMs) like Ollama or OpenRouter. It acts as a smart central access point, taking requests for LLM services and intelligently routing them to available API keys. The result is more reliable and efficient access to LLMs, even if individual keys fail, making it easier for developers to integrate AI features into their products.
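The routing described above can be sketched as round-robin key selection with failover. This is a minimal illustrative sketch, not the actual ollama-api-pool implementation; the class name, key values, and method names are all hypothetical.

```javascript
// Minimal sketch of round-robin API-key selection with failover
// (illustrative only; not the actual ollama-api-pool code).
class KeyPool {
  constructor(keys) {
    this.keys = keys;          // hypothetical list of upstream API keys
    this.cursor = 0;           // round-robin position
    this.disabled = new Set(); // keys taken out of rotation after errors
  }

  // Return the next healthy key, skipping any disabled ones.
  next() {
    for (let i = 0; i < this.keys.length; i++) {
      const key = this.keys[(this.cursor + i) % this.keys.length];
      if (!this.disabled.has(key)) {
        this.cursor = (this.cursor + i + 1) % this.keys.length;
        return key;
      }
    }
    throw new Error("no healthy keys left");
  }

  // Called when an upstream request fails with this key.
  markFailed(key) {
    this.disabled.add(key);
  }
}

const pool = new KeyPool(["key-a", "key-b", "key-c"]);
console.log(pool.next()); // key-a
pool.markFailed("key-b");
console.log(pool.next()); // key-c (key-b is skipped)
```

A real proxy would also re-enable keys after a cooldown and track per-key rate limits, but the rotation-plus-failover loop above is the core idea.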

Use this if you are a developer managing multiple API keys for Ollama or OpenRouter and need a robust way to ensure continuous, load-balanced access to LLM services.

Not ideal if you only use a single LLM API key for casual or low-volume requests, as the overhead of setting up a proxy pool might be unnecessary.

Tags: LLM-integration, API-management, application-development, system-architecture, AI-development
No package published · No dependents
Maintenance 10 / 25
Adoption 6 / 25
Maturity 15 / 25
Community 18 / 25

How are scores calculated?

Stars: 20
Forks: 12
Language: JavaScript
License: MIT
Last pushed: Jan 21, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/dext7r/ollama-api-pool"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.