JuliaTrustworthyAI/ConformalPrediction.jl
Predictive Uncertainty Quantification through Conformal Prediction for Machine Learning models trained in MLJ.
When you work with machine learning models and need to gauge how reliable their predictions are, this package helps you quantify that uncertainty. It wraps your existing Julia (MLJ) model and, instead of returning a single point prediction, produces a prediction interval that is guaranteed to contain the true outcome with a user-chosen probability (e.g. 95%). This is useful for scientists, engineers, and analysts who rely on model predictions for critical decisions.
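The core idea behind those intervals can be sketched in a few lines of plain Julia. This is a minimal illustration of split conformal prediction, not the package's actual API; the toy calibration data and the stand-in model below are made up for the example:

```julia
# Minimal sketch of split conformal prediction (illustrative only;
# ConformalPrediction.jl wraps this idea for models trained in MLJ).

# Toy calibration data: y is roughly 2x plus noise
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.3, 11.9, 14.1, 16.2, 17.8, 20.2]

model(x) = 2.0 * x                  # stand-in for any fitted point predictor

# Nonconformity scores: absolute residuals on the calibration set
scores = sort(abs.(ys .- model.(xs)))

# Conformal quantile for miscoverage level α:
# the ⌈(n+1)(1-α)⌉-th smallest calibration score
α = 0.1
n = length(scores)
q = scores[clamp(ceil(Int, (n + 1) * (1 - α)), 1, n)]

# Prediction interval for a new point, with ≥ 1-α marginal coverage
x_new = 5.5
interval = (model(x_new) - q, model(x_new) + q)
```

The only assumption needed for the coverage guarantee is exchangeability of the calibration and test data; no distributional form is assumed, which is what "distribution-free" means here.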
Use this if you need statistically rigorous, distribution-free guarantees on the uncertainty of your machine learning model's predictions in Julia.
Not ideal if you are looking for an explanation of *why* the model made a certain prediction, as this focuses solely on the reliability of the prediction itself.
Stars: 146
Forks: 10
Language: Julia
License: MIT
Category:
Last pushed: Feb 17, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/JuliaTrustworthyAI/ConformalPrediction.jl"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
valeman/awesome-conformal-prediction: A professionally curated list of awesome Conformal Prediction videos, tutorials, books, papers,...
zillow/quantile-forest: Quantile Regression Forests compatible with scikit-learn.
yromano/cqr: Conformalized Quantile Regression.
henrikbostrom/crepes: Python package for conformal prediction.
xRiskLab/pearsonify: Lightweight Python package for generating classification intervals in binary classification...