traceloop/go-openllmetry
Sister project to OpenLLMetry, but in Go. Open-source observability for your LLM application, based on OpenTelemetry
This project helps Go developers gain visibility into how their Large Language Model (LLM) applications are performing. It captures the requests and responses from your LLM calls (prompts and completions) and transforms them into standard OpenTelemetry data. That output can then be exported to the monitoring tools you already use, letting you track and troubleshoot your LLM application's behavior.
Use this if you are a Go developer building applications with LLMs and need a way to monitor their performance, understand user interactions, and debug issues within your existing observability platforms.
Not ideal if you are not developing in Go, or if you do not have an existing observability stack and are looking for a standalone LLM monitoring solution.
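To illustrate the idea, here is a minimal stdlib-only sketch of what "wrap an LLM call and capture prompt and completion as telemetry" looks like. This is not go-openllmetry's actual API: the `LLMSpan` type, the attribute keys, and `traceLLMCall` are all hypothetical stand-ins for the OpenTelemetry spans the real SDK emits.

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// LLMSpan is a hypothetical record modeled loosely on an OpenTelemetry span.
// The real SDK produces genuine OTel spans; this sketch only shows the shape
// of the data being captured.
type LLMSpan struct {
	Name       string            `json:"name"`
	Attributes map[string]string `json:"attributes"`
	DurationMS int64             `json:"duration_ms"`
}

// traceLLMCall wraps an LLM call and records the prompt, the completion,
// and the call duration as span attributes.
func traceLLMCall(model, prompt string, call func(string) string) (string, LLMSpan) {
	start := time.Now()
	completion := call(prompt)
	return completion, LLMSpan{
		Name: "llm.completion",
		Attributes: map[string]string{
			"llm.model":      model,
			"llm.prompt":     prompt,
			"llm.completion": completion,
		},
		DurationMS: time.Since(start).Milliseconds(),
	}
}

func main() {
	// A stub standing in for a real LLM client call.
	fakeLLM := func(prompt string) string { return "Hello from the model" }

	completion, span := traceLLMCall("gpt-example", "Say hello", fakeLLM)
	out, _ := json.Marshal(span)
	fmt.Println(completion)
	fmt.Println(string(out))
}
```

In the real library, the span would be handed to an OpenTelemetry exporter rather than printed, which is what lets the data flow into whatever backend your observability stack already uses.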
Stars: 41
Forks: 15
Language: Go
License: Apache-2.0
Category:
Last pushed: Jan 17, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/traceloop/go-openllmetry"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
Related tools
BerriAI/litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with...
vava-nessa/free-coding-models
Find, benchmark and install in CLI 158 FREE coding LLM models across 20 providers in real time
envoyproxy/ai-gateway
Manages Unified Access to Generative AI Services built on Envoy Gateway
theopenco/llmgateway
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
Portkey-AI/gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with...