oracle/macest

Model Agnostic Confidence Estimator (MACEst): a Python library for calibrating machine learning models' confidence scores.

Score: 54 / 100 (Established)

When you rely on machine learning for critical decisions, MACEst helps you understand how trustworthy each individual prediction is. It takes your existing model's predictions and data, then outputs a confidence score for classification tasks or a confidence interval for regression tasks. This is for data scientists, machine learning engineers, and analysts working with supervised learning models in high-stakes fields.
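The basic workflow wraps an already-trained point-prediction model. Below is a minimal sketch of the classification flow under the usage documented in the project's README; the class and method names (ModelWithConfidence, predict_confidence_of_point_prediction) come from that documentation and may differ between versions, and the data splits here are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

from macest.classification import models as cl_mod

# Synthetic data, split into point-prediction training, MACEst training,
# calibration, and test sets.
X, y = make_classification(n_samples=20_000, random_state=0)
X_pp_train, X_rest, y_pp_train, y_rest = train_test_split(
    X, y, test_size=0.66, random_state=0)
X_conf_train, X_rest, y_conf_train, y_rest = train_test_split(
    X_rest, y_rest, test_size=0.5, random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(
    X_rest, y_rest, test_size=0.5, random_state=0)

# Any fitted model with a predict() method can serve as the point predictor.
point_model = RandomForestClassifier(n_estimators=200, random_state=0)
point_model.fit(X_pp_train, y_pp_train)

# Wrap the point predictor with MACEst and calibrate on held-out data.
macest_model = cl_mod.ModelWithConfidence(point_model, X_conf_train, y_conf_train)
macest_model.fit(X_cal, y_cal)

# Per-prediction confidence scores for the point predictions.
conf = macest_model.predict_confidence_of_point_prediction(X_test)
```

For regression, the library's README describes an analogous wrapper that returns a prediction interval per point rather than a single score.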

100 stars. No commits in the last 6 months. Available on PyPI.

Use this if you need to know how confident your machine learning model is about each specific prediction, especially when wrong predictions can have significant consequences.

Not ideal if you only care about overall model accuracy and don't need to assess the certainty of individual predictions or understand when your model is operating outside its known data.

Tags: predictive-modeling, risk-assessment, model-validation, data-quality, decision-support
Status: Stale (no commits in 6 months)
Maintenance: 2 / 25
Adoption: 9 / 25
Maturity: 25 / 25
Community: 18 / 25


Stars: 100
Forks: 18
Language: Jupyter Notebook
License: UPL-1.0
Last pushed: Sep 26, 2025
Commits (30d): 0
Dependencies: 8

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/oracle/macest"

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.
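If you would rather call the endpoint from Python than from curl, here is a minimal sketch using the requests library. It hits the same public URL as the curl example above; the response schema is not documented in this listing, so the example prints the raw JSON rather than assuming field names.

```python
import requests

# Same public endpoint as the curl example; no key needed
# within the 100 requests/day limit.
url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/oracle/macest"

resp = requests.get(url, timeout=10)
resp.raise_for_status()

# Schema undocumented here, so print the payload as-is.
print(resp.json())
```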