traceloop/hub

High-scale LLM gateway written in Rust, with OpenTelemetry-based observability included.

Quality score: 55/100 (Established)

This is a high-performance gateway that helps developers and MLOps engineers manage their Large Language Model (LLM) integrations. It takes requests for LLM operations (such as chat completions or embeddings) and routes them to various LLM providers behind a unified API. The result is a consistent way to interact with different LLMs, plus built-in observability for monitoring their usage and performance.
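As a sketch of what "unified API" means in practice: a client sends the same OpenAI-style chat-completion payload regardless of which provider ultimately serves it, and the gateway routes on the model name. The endpoint URL, port, and model alias below are assumptions for illustration, not taken from the project's documentation.

```python
import json

# Hypothetical gateway endpoint -- hub's actual host, port, and path may differ.
GATEWAY_URL = "http://localhost:3000/api/v1/chat/completions"

# The same OpenAI-style request body works for any configured provider;
# the gateway decides where to route based on the model name.
payload = {
    "model": "gpt-4o-mini",  # assumed model alias configured in the gateway
    "messages": [
        {"role": "user", "content": "Summarize OpenTelemetry in one sentence."}
    ],
}

# Serialize the body as it would be POSTed to the gateway.
body = json.dumps(payload)
print(body)
```

Because the request shape is provider-agnostic, swapping the backing LLM is a gateway-configuration change rather than an application-code change.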


Use this if you are a developer or MLOps engineer building applications that need to use multiple LLMs, require high performance, and need detailed observability for all LLM interactions.

Not ideal if you are a data scientist or researcher who primarily uses a single LLM provider through its native SDK and doesn't require advanced routing or centralized observability for distributed applications.

Tags: LLM-operations, API-management, MLOps, application-development, observability

Package: none · Dependents: none

Maintenance: 10/25
Adoption: 10/25
Maturity: 16/25
Community: 19/25


Stars: 172
Forks: 31
Language: Rust
License: Apache-2.0
Category: llm-api-gateways
Last pushed: Mar 04, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/traceloop/hub"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
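To consume the endpoint's JSON programmatically, a sketch like the one below works. The field names (`name`, `score`, `stars`, `license`) are assumptions inferred from the card above, not documented guarantees about the API's response shape.

```python
import json

# A sample response body shaped like the card above; treat the exact
# field names as an assumption about what the API returns.
sample = '{"name": "traceloop/hub", "score": 55, "stars": 172, "license": "Apache-2.0"}'

data = json.loads(sample)
print(f"{data['name']}: {data['score']}/100, {data['stars']} stars")
```

In a real script, `sample` would be replaced by the body of the `curl` request shown above (e.g. via `urllib.request` or `requests`), with the same parsing applied to the live response.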