henrikbostrom/crepes
Python package for conformal prediction
crepes helps data scientists and machine learning engineers quantify the uncertainty in their models' predictions. Given a trained classifier or regressor, it produces well-calibrated p-values, prediction sets, or prediction intervals, so you can see how confident the model is in each individual prediction and obtain more reliable, trustworthy results.
558 stars. No commits in the last 6 months. Available on PyPI.
Use this if you need to attach statistically rigorous confidence measures and coverage guarantees to an existing machine learning model's predictions, rather than relying on point estimates alone.
Not ideal if you are looking for tools to build, train, or evaluate the core predictive performance of a machine learning model itself.
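To make the idea concrete, here is a minimal pure-Python sketch of split (inductive) conformal prediction for regression, the technique behind prediction intervals like those crepes produces. The stand-in model, the calibration data, and the function name `conformal_interval` are all hypothetical for illustration; crepes itself wraps a trained scikit-learn-style estimator rather than a bare function.

```python
def conformal_interval(predict, X_cal, y_cal, X_test, confidence=0.95):
    """Split conformal prediction intervals from a held-out calibration set."""
    # Nonconformity scores: absolute residuals on the calibration data.
    scores = sorted(abs(y - predict(x)) for x, y in zip(X_cal, y_cal))
    n = len(scores)
    # Quantile index with the finite-sample correction ceil((n + 1) * conf),
    # clipped to the largest available score (0-based indexing).
    k = min(n - 1, int(-(-((n + 1) * confidence) // 1)) - 1)
    q = scores[k]
    # Symmetric interval of half-width q around each test prediction.
    return [(predict(x) - q, predict(x) + q) for x in X_test]

# Hypothetical "trained model": y ~ 2x, with noisy calibration labels.
predict = lambda x: 2.0 * x
X_cal = [1.0, 2.0, 3.0, 4.0, 5.0]
y_cal = [2.1, 3.8, 6.3, 8.0, 9.7]
intervals = conformal_interval(predict, X_cal, y_cal, [10.0], confidence=0.8)
```

The key property is that the guarantee comes from the calibration residuals, not from any assumption about the model, which is why the library can wrap an arbitrary pre-trained classifier or regressor.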
Stars: 558
Forks: 45
Language: Python
License: BSD-3-Clause
Last pushed: Oct 09, 2025
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/henrikbostrom/crepes"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
valeman/awesome-conformal-prediction
A professionally curated list of awesome Conformal Prediction videos, tutorials, books, papers,...
zillow/quantile-forest
Quantile Regression Forests compatible with scikit-learn.
yromano/cqr
Conformalized Quantile Regression
xRiskLab/pearsonify
Lightweight Python package for generating classification intervals in binary classification...
rick12000/confopt
A Library for Conformal Hyperparameter Tuning