Affirm/splinator

Splinator: probabilistic calibration with regression splines

Score: 41 / 100 (Emerging)

When you have a predictive model that outputs probabilities, this tool helps you check whether those probabilities are trustworthy and then recalibrates them to be more accurate. You feed it your model's probability predictions and the actual outcomes, and it returns improved, more reliable probabilities. It is aimed at anyone who uses probability predictions in their work, such as risk analysts, forecasters, and decision scientists, and who needs a model's confidence scores to be well aligned with reality.
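As a concrete illustration of that workflow, here is a minimal sketch in Python. The module path, class name, and the behavior of predict are assumptions based on the project's scikit-learn-style description, not a verified API; check the repo's README before relying on them.

import numpy as np
from splinator.estimators import LinearSplineLogisticRegression  # assumed import path

# Scores from an upstream model and the outcomes that were actually observed.
raw_scores = np.array([[0.10], [0.35], [0.62], [0.80], [0.91]])  # shape (n_samples, 1)
outcomes = np.array([0, 0, 1, 1, 1])

# Fit the spline-based calibrator, then map raw scores to calibrated
# probabilities. 'predict' is assumed to return calibrated probabilities;
# the estimator may expose predict_proba instead.
calibrator = LinearSplineLogisticRegression()
calibrator.fit(raw_scores, outcomes)
print(calibrator.predict(raw_scores))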

Use this if your models are making probability predictions (like 'there's an 80% chance of rain') and you need to ensure those probabilities are realistic and not over- or under-confident.

Not ideal if you are primarily interested in the raw classification accuracy of your model rather than the quality of its probability estimates.

Tags: predictive-modeling, risk-assessment, forecasting, decision-science, model-validation
Package: none · Dependents: none
Maintenance: 6 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 24
Forks: 4
Language: Python
License: BSD-3-Clause
Last pushed: Jan 01, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Affirm/splinator"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
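If you prefer to query the endpoint from Python rather than curl, a minimal sketch using the requests library is below. The response schema is not documented on this page, so the example simply prints whatever JSON the endpoint returns.

import json
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Affirm/splinator"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surfaces HTTP errors, e.g. after exceeding the daily limit
print(json.dumps(resp.json(), indent=2))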