Scale3-Labs/langtrace

Langtrace 🔍 is an open-source, OpenTelemetry-based, end-to-end observability tool for LLM applications, providing real-time tracing, evaluations, and metrics for popular LLMs, LLM frameworks, vector databases, and more. Integrate using TypeScript or Python. 🚀💻📊

Score: 51 / 100 (Established)

This tool helps developers understand and improve AI applications built on large language models (LLMs). It captures information about how your LLM application runs, including its interactions with LLM APIs, vector databases, and frameworks, and returns real-time traces, performance insights such as latency and cost, and debugging tools for identifying issues. It is aimed at software developers and AI engineers building and maintaining LLM-powered applications.


Use this if you are building an application with LLMs and need to monitor its performance, debug issues, and analyze how it interacts with various AI services in real time.

Not ideal if you are looking for a non-technical, end-user analytics tool or if your application does not integrate with LLMs, vector databases, or LLM frameworks.

AI-application-development LLM-observability application-monitoring AI-debugging AI-engineering
No package published · No dependents
Maintenance 6 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 19 / 25


Stars: 1,184
Forks: 120
Language: TypeScript
License: AGPL-3.0
Last pushed: Nov 17, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/Scale3-Labs/langtrace"

Open to everyone: 100 requests/day with no key required. Get a free key to raise the limit to 1,000/day.
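The same endpoint can be called from code. A minimal sketch in Python, assuming only the URL pattern shown in the curl example above (`/quality/prompt-engineering/{owner}/{repo}`); the helper names `quality_url` and `fetch_quality` are illustrative, and the JSON response shape is not documented here:

```python
import json
import urllib.request
from urllib.parse import quote

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a GitHub owner/repo pair."""
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (requires network access)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

print(quality_url("Scale3-Labs", "langtrace"))
```

Swap in your own `owner`/`repo` to look up other projects; with a free key, rate limiting is the only difference in how you call the endpoint.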