ErickWendel/monitoring-llms-langfuse-ollama
Examples of how to monitor OpenAI SDK calls using Langfuse
This project helps developers building AI applications monitor and analyze how their models perform. It captures your application's LLM requests and produces detailed traces, cost figures, and performance metrics, making it easier to understand model behavior. It is aimed at developers and engineers who want to observe and debug their LLM integrations.
No commits in the last 6 months.
Use this if you are developing AI applications and need a self-hosted solution to trace, monitor, and evaluate your OpenAI SDK or local Ollama LLM calls.
Not ideal if you need a production-ready monitoring system out of the box; the self-hosted setup requires further configuration for scalability, security, and SSL/TLS.
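The typical setup this repo demonstrates boils down to two pieces: pointing the OpenAI SDK at Ollama's OpenAI-compatible endpoint, and wrapping the client with Langfuse's `observeOpenAI` helper so every call is traced. A minimal sketch, assuming the `openai` and `langfuse` npm packages, a local Ollama server, and a self-hosted Langfuse instance (model name, ports, and key placeholders are illustrative, not taken from the repo):

```javascript
// Ollama exposes an OpenAI-compatible API, so the stock OpenAI SDK can talk to it:
const ollamaClientOptions = {
  baseURL: "http://localhost:11434/v1", // Ollama's default OpenAI-compatible endpoint
  apiKey: "ollama",                     // any non-empty string; Ollama ignores it
};

// Langfuse credentials come from your self-hosted Langfuse instance (placeholders):
const langfuseEnv = {
  LANGFUSE_PUBLIC_KEY: "pk-lf-...",
  LANGFUSE_SECRET_KEY: "sk-lf-...",
  LANGFUSE_BASEURL: "http://localhost:3000",
};

// With those in place, tracing is a one-line wrapper around the client:
// import OpenAI from "openai";
// import { observeOpenAI } from "langfuse";
// const client = observeOpenAI(new OpenAI(ollamaClientOptions));
// const res = await client.chat.completions.create({
//   model: "llama3",
//   messages: [{ role: "user", content: "Hello" }],
// });
// Each call then appears as a trace in the Langfuse UI with latency and token counts.

console.log(ollamaClientOptions.baseURL);
```

Because Ollama speaks the OpenAI wire protocol, no code changes are needed beyond the `baseURL` swap; Langfuse instruments the wrapped client transparently.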
Stars
19
Forks
10
Language
JavaScript
License
MIT
Last pushed
Apr 07, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ErickWendel/monitoring-llms-langfuse-ollama"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
Arcotic-Solutions-Ltd/slash-engine
Slash Engine - Web based UI (Interface) for your locally hosted AI models.
UmairThakur/UPT-Junior-Analyst
AI tool for quick data analysis, visualisation, and model development, but much smarter and...
leodiegues/pinkmess
A completely opinionated PKMS terminal manager with AI features for lazy people just like me.
zitekjan1/pwnagotchi-store
🛒 Simplify Pwnagotchi management with PwnStore, a lightweight CLI package manager for seamless...
vishal-singh-baraiya/aipaze
AIPaze is a comprehensive framework for connecting large language models (LLMs) to external...