gchers/random-world

Standalone implementation of Conformal Prediction and other distribution-free Machine Learning methods.

Score: 40 / 100 (Emerging)

This project helps data scientists, machine learning engineers, and researchers assess the reliability of their machine learning predictions. It takes in structured data, typically CSV files, containing features and labels. The output consists of per-label p-values or explicit predictions made at a chosen confidence level, helping users recognise when a model's prediction is less certain.
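For intuition, the p-values reported by conformal prediction come from a simple rank computation: a candidate prediction's p-value is the fraction of calibration examples whose nonconformity score is at least as large as the test example's score. The Rust sketch below illustrates only that computation; it is not the random-world API, and the scores in it are hypothetical.

// Conceptual sketch (not the random-world API): computing a conformal p-value.
// Given nonconformity scores for calibration examples and a score for the test
// example, the p-value is the fraction of scores at least as large as the
// test score, counting the test example itself.
fn conformal_p_value(calibration_scores: &[f64], test_score: f64) -> f64 {
    let ge = calibration_scores
        .iter()
        .filter(|&&s| s >= test_score)
        .count();
    // +1 in numerator and denominator accounts for the test example itself.
    (ge as f64 + 1.0) / (calibration_scores.len() as f64 + 1.0)
}

fn main() {
    // Hypothetical nonconformity scores (e.g., distance to the nearest
    // neighbour with the same label), purely for illustration.
    let calibration_scores = [0.2, 0.5, 0.1, 0.9, 0.4];
    let p = conformal_p_value(&calibration_scores, 0.7);
    println!("p-value = {:.3}", p); // a small p-value flags an unusual prediction
}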

No commits in the last 6 months.

Use this if you need to quantify the confidence or uncertainty around individual predictions from your machine learning models, especially when dealing with critical applications or when data might deviate from expected patterns.

Not ideal if you are looking for a general-purpose machine learning library to build models from scratch, as its primary focus is on prediction confidence and exchangeability testing, not model training.

predictive-modeling model-validation uncertainty-quantification anomaly-detection data-stream-analysis
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 15 / 25


Stars: 14
Forks: 4
Language: Rust
License: MIT
Last pushed: Aug 23, 2019
Monthly downloads: 80
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/gchers/random-world"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.