CaesarYangs/prometheus_mcp_server

A Model Context Protocol (MCP) server enabling LLMs to query, analyze, and interact with Prometheus databases through predefined routes.

Quality score: 40 / 100 (Emerging)

This tool helps Site Reliability Engineers and DevOps professionals understand their system's health and performance by connecting Large Language Models (LLMs) to Prometheus databases. It allows LLMs to retrieve specific metrics, analyze data within custom time ranges, and explore usage patterns without needing to write complex PromQL queries manually. The output includes structured data and analyses from your Prometheus metrics.
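Under the hood, a tool like this ultimately translates an LLM's request into calls against Prometheus's standard HTTP API. The sketch below is illustrative only, not code from this repository: the `/api/v1/query_range` endpoint and its `query`, `start`, `end`, and `step` parameters are part of Prometheus itself, while the helper name and the local server address are assumptions.

```python
# Illustrative sketch: composing a Prometheus range-query URL, the kind of
# request an MCP route would issue on the LLM's behalf. Only the URL is
# built here; no network call is made.
from urllib.parse import urlencode


def build_range_query(base_url: str, promql: str, start: int, end: int,
                      step: str = "60s") -> str:
    """Compose a Prometheus /api/v1/query_range URL for a PromQL expression."""
    params = urlencode({"query": promql, "start": start, "end": end, "step": step})
    return f"{base_url}/api/v1/query_range?{params}"


# Example: CPU usage rate over a one-hour window, sampled every 60 seconds.
url = build_range_query("http://localhost:9090",
                        "rate(node_cpu_seconds_total[5m])",
                        start=1700000000, end=1700003600)
```

An LLM-facing route would then fetch this URL and return the JSON time series to the model for analysis, sparing the user from writing the PromQL by hand.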

No commits in the last 6 months.

Use this if you are an SRE or DevOps engineer who wants to use an LLM to query and analyze your Prometheus monitoring data for system health, performance insights, or debugging.

Not ideal if you don't use Prometheus for monitoring or if you prefer to write PromQL queries manually without LLM assistance.

Topics: Site Reliability Engineering, DevOps, System Monitoring, Incident Response, Performance Analysis
Status: Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 17 / 25

Stars: 34
Forks: 11
Language: Python
License: MIT
Last pushed: Apr 05, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/CaesarYangs/prometheus_mcp_server"

Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
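The same endpoint can be called from Python instead of curl. This is a minimal sketch: the base URL comes from the curl command above, but the shape of the JSON response is not documented here, so it is simply printed as-is rather than parsed into named fields.

```python
# Minimal sketch: build the quality-API URL for a given repo and (optionally)
# fetch it. Fetching is left commented out so the example runs offline.
import json
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"


def quality_url(owner: str, repo: str) -> str:
    """Return the quality-score endpoint URL for an owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


url = quality_url("CaesarYangs", "prometheus_mcp_server")

# Uncomment to fetch (counts against the 100 requests/day anonymous quota):
# with urlopen(url) as resp:
#     print(json.dumps(json.load(resp), indent=2))
```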