rmitsuboshi/miniboosts
A collection of boosting algorithms written in Rust 🦀
This library helps machine learning researchers compare and develop boosting algorithms. You provide training data and a weak learner, and it outputs a strong hypothesis built by combining many weak hypotheses. It is used by researchers who are exploring new boosting algorithms or benchmarking existing ones and want to avoid the performance limitations of implementations in slower languages.
No commits in the last 6 months.
Use this if you are a researcher designing or evaluating new boosting algorithms and need a robust, high-performance platform for comparison.
Not ideal if you are a practitioner looking for an off-the-shelf solution to apply machine learning models to business problems rather than to research the algorithms themselves.
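To make the "combine many weak learners into a strong hypothesis" idea concrete, here is a minimal, self-contained AdaBoost sketch in Rust using 1-D threshold stumps as the weak learner. This is a toy illustration of the boosting loop that miniboosts generalizes; it does not use the miniboosts API, and all names here are hypothetical.

```rust
// Toy AdaBoost on 1-D data with threshold stumps (illustrative only,
// not the miniboosts API).

// Weak hypothesis: predict +1 if x is at or above the threshold, else -1.
fn stump(threshold: f64, x: f64) -> f64 {
    if x >= threshold { 1.0 } else { -1.0 }
}

// Run `rounds` boosting rounds; return the ensemble as (weight, threshold) pairs.
fn adaboost(xs: &[f64], ys: &[f64], rounds: usize) -> Vec<(f64, f64)> {
    let n = xs.len();
    let mut w = vec![1.0 / n as f64; n]; // uniform example weights
    let mut ensemble = Vec::new();

    for _ in 0..rounds {
        // Weak learner: pick the candidate threshold with lowest weighted error.
        let (mut best_t, mut best_err) = (0.0, f64::INFINITY);
        for &t in xs {
            let err: f64 = (0..n)
                .filter(|&i| stump(t, xs[i]) != ys[i])
                .map(|i| w[i])
                .sum();
            if err < best_err {
                best_err = err;
                best_t = t;
            }
        }
        // Stop if the weak learner is no better than random guessing.
        if best_err >= 0.5 { break; }
        let eps = best_err.max(1e-10); // avoid division by zero
        let alpha = 0.5 * ((1.0 - eps) / eps).ln();
        // Reweight: misclassified examples gain weight, then renormalize.
        for i in 0..n {
            w[i] *= (-alpha * ys[i] * stump(best_t, xs[i])).exp();
        }
        let z: f64 = w.iter().sum();
        for wi in w.iter_mut() { *wi /= z; }
        ensemble.push((alpha, best_t));
    }
    ensemble
}

// Strong hypothesis: sign of the weighted vote of all stumps.
fn predict(ensemble: &[(f64, f64)], x: f64) -> f64 {
    let score: f64 = ensemble.iter().map(|&(a, t)| a * stump(t, x)).sum();
    if score >= 0.0 { 1.0 } else { -1.0 }
}

fn main() {
    // Toy data: label +1 for x >= 3, else -1.
    let xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0];
    let ys = [-1.0, -1.0, -1.0, 1.0, 1.0, 1.0];
    let ensemble = adaboost(&xs, &ys, 5);
    for &x in &xs {
        println!("x = {x}: predicted {}", predict(&ensemble, x));
    }
}
```

The key design point, which miniboosts abstracts behind its booster and weak-learner traits, is the round structure: the booster maintains a distribution over examples, queries the weak learner against that distribution, and folds the returned hypothesis into a weighted ensemble.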
Stars
57
Forks
9
Language
Rust
License
MIT
Category
ml-frameworks
Last pushed
May 19, 2025
Monthly downloads
20
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/rmitsuboshi/miniboosts"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost
Natural Gradient Boosting for Probabilistic Prediction
microsoft/LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost
Tree-Boosting, Gaussian Processes, and Mixed-Effects Models