ottic-ai/llm-gateway
Open-source library built for fast and reliable connections to different LLM providers
This library helps developers reliably connect their applications to large language models (LLMs) from providers such as OpenAI, Anthropic, and Azure. You send it your LLM requests, and it keeps responses consistent: if the primary provider fails, it automatically switches to a backup. It is aimed at developers building applications that depend on LLMs.
No commits in the last 6 months. Available on npm.
Use this if you are a developer building an application that needs to interact with different LLM providers and requires robust error handling, consistent output, and low-latency responses.
Not ideal if you are an end-user looking for a no-code solution or if your application only ever uses a single LLM provider without any need for failover or unified API management.
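The failover behavior described above can be sketched as a simple pattern: try the primary provider, and fall back to the next one on error. This is a minimal, hypothetical illustration of that pattern; the `Provider` type and `callWithFallback` function are names invented here, not this library's actual API.

```typescript
// Hypothetical sketch of provider failover, not ottic-ai/llm-gateway's real API.
type Provider = {
  name: string;
  complete: (prompt: string) => Promise<string>;
};

async function callWithFallback(
  providers: Provider[],
  prompt: string
): Promise<string> {
  let lastError: unknown;
  for (const p of providers) {
    try {
      return await p.complete(prompt); // first successful provider wins
    } catch (err) {
      lastError = err; // remember the failure and try the next provider
    }
  }
  throw new Error(`All providers failed: ${String(lastError)}`);
}

// Demo with mock providers: the "primary" always fails, the backup answers.
const primary: Provider = {
  name: "primary",
  complete: async () => { throw new Error("rate limited"); },
};
const backup: Provider = {
  name: "backup",
  complete: async (prompt) => `echo: ${prompt}`,
};

callWithFallback([primary, backup], "hello").then((r) => console.log(r));
// prints "echo: hello"
```

A real gateway would add retries, timeouts, and response normalization on top of this loop, but the core idea is the same ordered fallback.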
Stars: 7
Forks: 1
Language: TypeScript
License: MIT
Category:
Last pushed: Dec 23, 2024
Commits (30d): 0
Dependencies: 4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ottic-ai/llm-gateway"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
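The same endpoint shown in the curl command can be called from code. A minimal sketch using Node's built-in `fetch` (Node 18+); the response shape is not documented here, so it is left untyped:

```typescript
// Fetch this repo's quality data from the endpoint given in the curl example.
const repo = "ottic-ai/llm-gateway";
const url = `https://pt-edge.onrender.com/api/v1/quality/llm-tools/${repo}`;

async function fetchQualityData(): Promise<unknown> {
  const res = await fetch(url); // unauthenticated: 100 requests/day
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json(); // response schema is not documented here
}
```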
Higher-rated alternatives
BerriAI/litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with...
vava-nessa/free-coding-models
Find, benchmark and install in CLI 158 FREE coding LLM models across 20 providers in real time
envoyproxy/ai-gateway
Manages Unified Access to Generative AI Services built on Envoy Gateway
theopenco/llmgateway
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
Portkey-AI/gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with...