unifyroute/unifyroute
Stop being locked into one LLM provider. UnifyRoute is a self-hosted gateway that routes, fails over, and manages quotas across OpenAI, Anthropic, and more — with a drop-in OpenAI-compatible API.
UnifyRoute sits between your application and your large language model (LLM) providers. It routes each request to a provider such as OpenAI or Anthropic, returns the response from the chosen model, fails over automatically when a provider is down, and enforces your usage quotas. It suits developers and operations teams building or maintaining applications that depend on multiple LLM services.
Use this if you need to ensure your AI applications remain resilient by automatically switching between different LLM providers and want better visibility into your LLM usage and costs.
Not ideal if you only use a single LLM provider and don't require advanced routing, failover, or quota management.
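Because the gateway exposes an OpenAI-compatible API, existing client code only needs its base URL changed. A minimal sketch of calling such a gateway with the Python standard library, assuming a local deployment at `localhost:8080` and an OpenAI-style `/v1/chat/completions` route (the address, port, and model alias are assumptions, not taken from the UnifyRoute docs):

```python
import json
import urllib.request

# Assumed gateway address -- replace with wherever your instance runs.
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"

def chat(prompt: str, model: str = "gpt-4o") -> str:
    """Send an OpenAI-style chat completion request through the gateway.

    The gateway decides which upstream provider actually serves the
    request, handling failover and quota checks transparently.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses carry the text here.
    return body["choices"][0]["message"]["content"]
```

Any OpenAI SDK can be pointed at the same endpoint by overriding its base URL, so switching from a direct provider to the gateway is a one-line configuration change.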
Stars
8
Forks
4
Language
Python
License
Apache-2.0
Category
Last pushed
Mar 11, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/unifyroute/unifyroute"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
neuraxes/neurouter
A powerful router that provides a unified interface for all upstream LLMs
CarloLepelaars/irouter
Access 100s of LLMs with minimal lines of code
deeflect/smart-spawn
Intelligent model routing for AI agents. Auto-selects the right LLM per task based on...
peva3/SmarterRouter
SmarterRouter: An intelligent LLM gateway and VRAM-aware router for Ollama, llama.cpp, and...
r9s-ai/open-next-router
A lightweight, DSL-driven LLM gateway for routing, patching provider quirks, and normalizing...