ottic-ai/llm-gateway

Open-source library for fast, reliable connections to different LLM providers

Score: 38 / 100 (Emerging)

This library helps developers reliably connect their applications to large language models (LLMs) from providers such as OpenAI, Anthropic, and Azure. You hand it your LLM requests, and it returns consistent responses even when the primary provider fails, automatically switching to a backup. It is aimed at developers building applications that depend on LLMs.
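The failover behavior described above can be sketched as a simple "try providers in order" loop. This is an illustrative pattern only, not the library's actual API; the names `Provider`, `complete`, and `withFailover` are hypothetical.

```typescript
// Hypothetical types, not the library's real API: each provider exposes a
// complete() call that either resolves with text or rejects on failure.
type Provider = {
  name: string;
  complete: (prompt: string) => Promise<string>;
};

// Try each provider in order and return the first successful response.
// If every provider fails, rethrow the last error seen.
async function withFailover(
  providers: Provider[],
  prompt: string
): Promise<string> {
  let lastError: unknown;
  for (const p of providers) {
    try {
      return await p.complete(prompt);
    } catch (err) {
      // Primary failed; fall through to the next (backup) provider.
      lastError = err;
    }
  }
  throw lastError ?? new Error("no providers configured");
}
```

In practice you would list an OpenAI-backed provider first and an Anthropic- or Azure-backed provider as the backup, so callers see a single consistent response shape regardless of which one actually served the request.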

No commits in the last 6 months. Available on npm.

Use this if you are building an application that needs to talk to multiple LLM providers and requires robust error handling, consistent output, and low-latency responses.

Not ideal if you are an end-user looking for a no-code solution or if your application only ever uses a single LLM provider without any need for failover or unified API management.

Tags: LLM application development, API integration, backend development, service reliability, microservices
Stale: 6 months
Maintenance: 0 / 25
Adoption: 4 / 25
Maturity: 25 / 25
Community: 9 / 25


Stars: 7
Forks: 1
Language: TypeScript
License: MIT
Category: llm-api-gateways
Last pushed: Dec 23, 2024
Commits (30d): 0
Dependencies: 4

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ottic-ai/llm-gateway"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.