unifyroute/unifyroute

Stop being locked into one LLM provider. UnifyRoute is a self-hosted gateway that routes, fails over, and manages quotas across OpenAI, Anthropic, and more — with a drop-in OpenAI-compatible API.

Score: 40 / 100 (Emerging)

This gateway manages your large language model (LLM) requests by routing them across providers such as OpenAI and Anthropic. You send it your prompts or application requests, and it returns responses from the chosen LLM, failing over when a provider is down and enforcing your usage quotas. It's aimed at developers and operations teams building or maintaining applications that depend on multiple LLM services.

Use this if you need to ensure your AI applications remain resilient by automatically switching between different LLM providers and want better visibility into your LLM usage and costs.

Not ideal if you only use a single LLM provider and don't require advanced routing, failover, or quota management.
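The failover behavior described above can be sketched client-side. This is a minimal illustration of the ordered-fallback pattern the gateway automates, not UnifyRoute's actual implementation; the provider call stubs are hypothetical stand-ins for real OpenAI/Anthropic clients.

```python
# Hypothetical provider stubs -- in a real setup these would call the
# OpenAI and Anthropic APIs. Here the first provider simulates an outage.
def call_openai(prompt):
    raise ConnectionError("provider unavailable")

def call_anthropic(prompt):
    return f"[anthropic] response to: {prompt}"

def route_with_failover(prompt, providers):
    """Try each (name, call) pair in order; return the first success."""
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:
            errors.append((name, exc))
    raise RuntimeError(f"all providers failed: {errors}")

providers = [("openai", call_openai), ("anthropic", call_anthropic)]
name, reply = route_with_failover("hello", providers)
print(name)  # anthropic
```

A gateway like this one moves that loop out of application code, so every client gets the same failover and quota policy behind a single OpenAI-compatible endpoint.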

AI application development · LLM operations · API management · cloud cost optimization · system reliability
No Package · No Dependents
Maintenance 10 / 25
Adoption 4 / 25
Maturity 11 / 25
Community 15 / 25


Stars: 8
Forks: 4
Language: Python
License: Apache-2.0
Last pushed: Mar 11, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/unifyroute/unifyroute"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
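The same request can be made from Python with the standard library. The endpoint is the one shown in the curl example; the fetch itself is left commented out since it needs network access, and the response shape is not documented here.

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build the quality-data URL for a given repo."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("llm-tools", "unifyroute", "unifyroute")
print(url)

# Uncomment to fetch (requires network access):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(data)
```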