ferro-labs/ai-gateway
Open-source AI Gateway written in Go: one API for OpenAI, Anthropic, Bedrock, Azure, and 100+ LLMs. Built-in caching, guardrails, retries, and cost optimization. Run as a proxy or embed as a library.
This tool gives developers a single access point for large language models from providers such as OpenAI, Anthropic, and AWS Bedrock: it accepts requests in a standard LLM format, routes them to the best-suited provider, and layers on caching, cost controls, and security guardrails along the way. It suits developers building AI-powered applications who need robust, high-performance LLM integration.
Use this if you are a developer building an application that needs to reliably and efficiently interact with multiple LLM providers, manage costs, and enforce security policies.
Not ideal if you are a non-developer user looking for a ready-to-use AI application, as this is a technical tool for integrating LLMs into existing systems.
Stars: 19
Forks: 5
Language: Go
License: Apache-2.0
Category:
Last pushed: Mar 13, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ferro-labs/ai-gateway"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
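For consumers who prefer Go over curl, the same endpoint can be fetched with the standard library. This is a sketch built only from the URL shown above; the response schema is not documented here, so the body is printed as-is for inspection. The `buildURL` helper is an illustrative name, not part of the API.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

// buildURL assembles the quality-endpoint URL for a given owner/repo
// slug, matching the curl example above.
func buildURL(slug string) string {
	return "https://pt-edge.onrender.com/api/v1/quality/llm-tools/" + slug
}

func main() {
	// No API key needed for up to 100 requests/day.
	resp, err := http.Get(buildURL("ferro-labs/ai-gateway"))
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		fmt.Println("read failed:", err)
		return
	}
	fmt.Println("status:", resp.Status)
	// Schema is undocumented here; print the raw JSON to inspect it.
	fmt.Println(string(body))
}
```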
Higher-rated alternatives
BerriAI/litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with...
vava-nessa/free-coding-models
Find, benchmark and install in CLI 158 FREE coding LLM models across 20 providers in real time
envoyproxy/ai-gateway
Manages Unified Access to Generative AI Services built on Envoy Gateway
theopenco/llmgateway
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
Portkey-AI/gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with...