sustainable-computing-io/kepler-model-server
Model Server for Kepler
This project helps developers integrate and manage power consumption models within the Kepler ecosystem. It takes pre-trained power models as input and provides estimated power usage metrics for running workloads. Its primary users are site reliability engineers, DevOps engineers, and cloud architects who manage infrastructure and want to optimize energy efficiency.
Use this if you are a developer looking to deploy and serve power consumption estimation models for your cloud-native applications and infrastructure.
Not ideal if you are an end-user simply looking for a dashboard to view power consumption metrics, as this requires technical deployment and integration.
Stars: 29
Forks: 26
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 02, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/sustainable-computing-io/kepler-model-server"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
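The endpoint above can also be called programmatically. A minimal sketch in Python, assuming the API returns JSON; the `quality_url` and `request_headers` helpers are illustrative, and the `x-api-key` header name is a guess rather than the documented one:

```python
# Sketch: build the per-repository quality endpoint URL and optional
# auth headers. Anonymous calls get 100 requests/day; a free key
# raises that to 1,000/day.
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality/mlops"


def quality_url(owner: str, repo: str) -> str:
    """Build the endpoint URL for one repository (owner/name)."""
    return f"{BASE}/{quote(owner)}/{quote(repo)}"


def request_headers(api_key=None) -> dict:
    """Return request headers; the key header name is an assumption."""
    return {"x-api-key": api_key} if api_key else {}


url = quality_url("sustainable-computing-io", "kepler-model-server")
headers = request_headers()  # anonymous tier, no key
```

The URL produced matches the `curl` example above; fetching it with any HTTP client plus these headers is left to the caller.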
Related tools
feast-dev/feast
The Open Source Feature Store for AI/ML
clearml/clearml-serving
ClearML - Model-Serving Orchestration and Repository Solution
lakehq/sail
LakeSail's computation framework with a mission to unify batch processing, stream processing,...
PaddlePaddle/Serving
A flexible, high-performance carrier for machine learning models (PaddlePaddle's model-serving deployment framework)
SeldonIO/MLServer
An inference server for your machine learning models, including support for multiple frameworks,...