ArtCC/serverwatch-ai-bot
A single-user Telegram bot that monitors your server and answers questions about it using a local LLM via Ollama. Metrics are sourced from Glances running in the same Docker Compose stack.
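The description above implies a simple pipeline: the bot pulls metrics from the Glances REST API and feeds them, with the user's question, to a local LLM. A minimal sketch of that flow, assuming Glances is reachable at a hypothetical `glances` service name inside the Compose stack (the function and field names here are illustrative, not taken from the repository):

```python
import json
import urllib.request

# Hypothetical service hostname inside the Docker Compose stack.
GLANCES_URL = "http://glances:61208/api/3"

def fetch_metric(name: str) -> dict:
    # Glances exposes each metric group as JSON, e.g. /api/3/cpu or /api/3/mem.
    with urllib.request.urlopen(f"{GLANCES_URL}/{name}") as resp:
        return json.load(resp)

def build_prompt(question: str, cpu: dict, mem: dict) -> str:
    # Condense raw Glances output into a compact context for the local LLM.
    context = (
        f"CPU usage: {cpu.get('total', 'n/a')}%\n"
        f"Memory usage: {mem.get('percent', 'n/a')}% "
        f"({mem.get('used', 0) // 2**20} MiB used)"
    )
    return f"Server metrics:\n{context}\n\nQuestion: {question}"
```

The prompt built this way would then be sent to Ollama's local HTTP API; keeping the metric summary short helps small local models answer reliably.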
Stars: —
Forks: —
Language: Python
License: Apache-2.0
Category: —
Last pushed: Mar 15, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ArtCC/serverwatch-ai-bot"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
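The same endpoint can be queried from code. A small helper that builds the endpoint URL for any repository (the path pattern is taken from the curl example above; the function name is illustrative):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    # Mirrors the path used in the curl example: /<owner>/<repo>.
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Response schema is not documented here, so we return the raw JSON.
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```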
Higher-rated alternatives
innightwolfsleep/llm_telegram_bot
An LLM Telegram bot
pond918/llm-bots
An npm package of LLM bots with tree-structured chat histories; can be deployed in a local browser, ...
iongpt/LLM-for-Whatsapp
A WhatsApp auto-responder with LLM integration; supports the OpenAI API as well as local LLMs
Engine-Labs/engine-core
Chat strategies for LLMs
flows-network/telegram-llm
A Telegram LLM bot written in Rust