henrikbostrom/crepes

Python package for conformal prediction

Quality score: 52 / 100 (Established)

This tool helps data scientists and machine learning engineers quantify the uncertainty in their machine learning models' predictions. You provide a trained classifier or regressor, and it outputs well-calibrated p-values, prediction sets, or prediction intervals, so you can see how confident the model is in each individual prediction and obtain more reliable, trustworthy results.

558 stars. No commits in the last 6 months. Available on PyPI.

Use this if you need to add statistically rigorous confidence measures, with coverage guarantees, to the predictions of an existing machine learning model, rather than relying on point estimates alone.

Not ideal if you are looking for tools to build, train, or evaluate the core predictive performance of a machine learning model itself.

machine-learning-reliability predictive-uncertainty model-confidence statistical-guarantees algorithmic-trustworthiness
Stale for 6 months
Maintenance: 2 / 25
Adoption: 10 / 25
Maturity: 25 / 25
Community: 15 / 25


Stars: 558
Forks: 45
Language: Python
License: BSD-3-Clause
Last pushed: Oct 09, 2025
Commits (30d): 0
Dependencies: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/henrikbostrom/crepes"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.