obirler/LLMProxy

LLMProxy is an intelligent large language model backend routing proxy service.

Quality score: 38 / 100 (Emerging)

This tool helps developers building LLM-powered applications manage their connections to multiple providers such as OpenAI, Google Gemini, or local models. It acts as a central hub: your application sends requests to the proxy, which routes each one to the best available backend. This keeps your application responsive and reliable even when some providers have issues, and lets you combine outputs from multiple models.

Use this if your application relies on LLMs and you want to abstract away the complexity of managing multiple API keys, routing requests, handling errors, and orchestrating different providers or models.

Not ideal if you only use a single, stable LLM provider for a simple application and do not anticipate needing advanced routing, load balancing, or failover capabilities.
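The failover behavior described above, routing each request to the best available backend and falling back when a provider errors, can be sketched in a few lines. This is a minimal illustration of the pattern, not LLMProxy's actual code; the backend names and call interface here are hypothetical.

```python
class BackendError(Exception):
    """Raised when an LLM backend cannot serve a request."""


def route_with_failover(prompt, backends):
    """Send `prompt` to the first backend that answers successfully.

    `backends` is an ordered list of (name, callable) pairs; the order
    encodes routing preference. Returns (backend_name, response).
    """
    errors = {}
    for name, call in backends:
        try:
            return name, call(prompt)
        except BackendError as exc:
            errors[name] = exc  # record the failure and try the next backend
    raise BackendError(f"all backends failed: {list(errors)}")


# Two stand-in backends: the first always fails, the second echoes the prompt.
def flaky_provider(prompt):
    raise BackendError("rate limited")


def local_model(prompt):
    return f"echo: {prompt}"


name, answer = route_with_failover(
    "hello", [("flaky", flaky_provider), ("local", local_model)]
)
print(name, answer)  # → local echo: hello
```

A real proxy would add health checks, retries, and load balancing on top of this ordering, but the core contract is the same: callers see one endpoint, and provider failures are absorbed by the routing layer.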

Tags: LLM application development, API management, backend orchestration, microservices, reliability engineering
No package · No dependents
Maintenance: 6 / 25
Adoption: 6 / 25
Maturity: 15 / 25
Community: 11 / 25


Stars: 22
Forks: 3
Language: C#
License: MIT
Category: llm-api-gateways
Last pushed: Dec 06, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/obirler/LLMProxy"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.