jmamda/OpenTrace

A local reverse proxy that records every LLM request/response to SQLite. No cloud, no data leaving your machine.

Score: 38 / 100 — Emerging

This tool helps developers understand and debug their applications' interactions with large language models (LLMs) by recording every request and response. It intercepts your application's LLM calls, captures the details, and stores them in a local SQLite database. The result is a detailed log of all LLM traffic, letting you review prompts, responses, costs, and latencies.

Use this if you need to log and inspect all LLM calls your application makes, want to track costs and latencies locally without sending data to a third-party service, and prefer a simple setup over complex infrastructure.

Not ideal if you need a shared, enterprise-grade observability platform with team collaboration features and advanced analytics beyond local inspection.
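Because traces land in a local SQLite file, they can be inspected with any SQLite client or a few lines of script. Below is a minimal sketch of the kind of cost/latency query this enables. Note that the table and column names used here (`traces`, `cost_usd`, `latency_ms`) are illustrative assumptions, not OpenTrace's documented schema, and the example builds its own in-memory database rather than opening a real trace file.

```python
import sqlite3

# Hypothetical schema sketch: OpenTrace's actual table and column
# names are not documented here and may differ.
conn = sqlite3.connect(":memory:")  # stand-in for OpenTrace's trace database file
conn.execute(
    """CREATE TABLE traces (
        id INTEGER PRIMARY KEY,
        model TEXT,
        prompt TEXT,
        response TEXT,
        cost_usd REAL,
        latency_ms REAL
    )"""
)
conn.executemany(
    "INSERT INTO traces (model, prompt, response, cost_usd, latency_ms) "
    "VALUES (?, ?, ?, ?, ?)",
    [
        ("gpt-4o", "Summarize...", "Summary...", 0.012, 840.0),
        ("gpt-4o", "Translate...", "Translation...", 0.008, 620.0),
    ],
)

# Aggregate cost and latency per model, as a local cost-monitoring
# query over recorded traces might.
for model, total_cost, avg_latency in conn.execute(
    "SELECT model, SUM(cost_usd), AVG(latency_ms) FROM traces GROUP BY model"
):
    print(f"{model}: ${total_cost:.3f} total, {avg_latency:.0f} ms avg")
```

The same query pattern works against the real trace file once you know the actual schema (e.g. via `sqlite3 <file> .schema`).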

Tags: LLM-development, application-debugging, cost-monitoring, privacy-focused, local-observability
No package published · No dependents
Maintenance 10 / 25
Adoption 4 / 25
Maturity 11 / 25
Community 13 / 25


Stars: 7
Forks: 2
Language: Rust
License: MIT
Last pushed: Mar 01, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/jmamda/OpenTrace"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.