xRiskLab/pearsonify
Lightweight Python package for generating classification intervals in binary classification tasks using Pearson residuals and conformal prediction
If your model predicts a binary 'yes' or 'no' outcome (such as customer churn or disease presence), this tool quantifies how confident each prediction is. You supply a trained model and its probability scores, and it returns a probability interval for each prediction. It is aimed at data scientists, machine learning engineers, and analysts who build and evaluate binary classification models.
Available on PyPI.
Use this if you need to add statistically sound, intuitive confidence intervals to your binary classification model's predictions without making strong assumptions about your data.
Not ideal if you need multi-class classification or lack a pre-trained model that outputs probability estimates.
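The underlying idea combines split conformal prediction with absolute Pearson residuals as the nonconformity score: calibrate a quantile of the residuals on held-out data, then widen each test prediction by that quantile scaled by its binomial standard deviation. The sketch below illustrates that standard construction only; the function name and signature are hypothetical and are not pearsonify's actual API.

```python
import numpy as np

def pearson_conformal_intervals(p_cal, y_cal, p_test, alpha=0.1, eps=1e-6):
    """Illustrative split-conformal intervals for binary classification.

    Uses |y - p| / sqrt(p * (1 - p)) (the absolute Pearson residual)
    as the nonconformity score on a calibration set, then converts the
    calibrated quantile back into a per-prediction probability interval.
    """
    # Clip probabilities away from 0/1 so the variance term is well-defined.
    p_cal = np.clip(p_cal, eps, 1 - eps)
    scores = np.abs(y_cal - p_cal) / np.sqrt(p_cal * (1 - p_cal))

    # Finite-sample-corrected quantile level for coverage >= 1 - alpha.
    n = len(scores)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, q_level)

    # Interval half-width scales with the binomial standard deviation.
    p_test = np.clip(p_test, eps, 1 - eps)
    half = q * np.sqrt(p_test * (1 - p_test))
    lo = np.clip(p_test - half, 0.0, 1.0)
    hi = np.clip(p_test + half, 0.0, 1.0)
    return lo, hi
```

Because the score is normalized by sqrt(p(1-p)), intervals are naturally wider for predictions near 0.5 and narrower for confident predictions near 0 or 1, which is the intuition the package trades on.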
Stars: 23
Forks: 2
Language: Python
License: MIT
Category:
Last pushed: Feb 20, 2026
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/xRiskLab/pearsonify"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
Higher-rated alternatives
zillow/quantile-forest
Quantile Regression Forests compatible with scikit-learn.
valeman/awesome-conformal-prediction
A professionally curated list of awesome Conformal Prediction videos, tutorials, books, papers,...
yromano/cqr
Conformalized Quantile Regression
henrikbostrom/crepes
Python package for conformal prediction
rick12000/confopt
A Library for Conformal Hyperparameter Tuning