matdev83/llm-interactive-proxy

Connect any LLM-powered client app, such as a coding agent, to any supported inference backend/model.

Overall score: 39 / 100 (Emerging)

This project helps developers connect their AI-powered applications, such as coding agents, to large language models (LLMs) from different providers. It sits between your existing AI client application and your chosen LLM backend, routing requests and adding features such as security, cost control, and failover. Developers who build and manage AI applications will find it useful for gaining flexibility and control over their LLM integrations.

Use this if you need to integrate your AI application with multiple LLM providers, add robust security, or gain more control and observability over your agentic workflows without rewriting your client code.

Not ideal if you are a casual user of a single LLM API and do not require advanced routing, security, or multi-provider management.

Tags: AI-application-development, LLM-integration, agentic-AI, API-management, production-AI-systems
No package · No dependents
Maintenance 13 / 25
Adoption 6 / 25
Maturity 15 / 25
Community 5 / 25


Stars: 15
Forks: 1
Language: Python
License: AGPL-3.0
Category: llm-api-gateways
Last pushed: Mar 25, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/matdev83/llm-interactive-proxy"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
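For programmatic use, the same endpoint can be called from Python. A minimal sketch, assuming the endpoint returns JSON (the response schema is not documented here, so inspect the result yourself); the helper names `quality_url` and `fetch_quality` are illustrative, not part of any official client:

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a GitHub repo in the llm-tools category."""
    return f"{BASE}/llm-tools/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality report and parse it as JSON.

    Unauthenticated calls are subject to the 100 requests/day limit;
    the shape of the returned dict is an assumption to verify.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Example (performs a network call):
# report = fetch_quality("matdev83", "llm-interactive-proxy")
```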