arogozhnikov/infiniteboost
InfiniteBoost: building infinite ensembles with gradient descent
This project offers an ensemble modeling technique that combines the strengths of random forests and gradient boosting to make highly accurate predictions. You input your labeled dataset, and it outputs a robust predictive model that avoids common overfitting issues. Data scientists, machine learning engineers, and researchers who build predictive models will find this useful for improving model performance and generalization.
183 stars. No commits in the last 6 months.
Use this if you need a predictive model that combines the high accuracy of gradient boosting with the overfitting resistance of random forests.
Not ideal if you need an out-of-the-box solution with a user-friendly interface or if you are not comfortable with command-line tools and scripting.
Stars
183
Forks
22
Language
Jupyter Notebook
License
—
Category
ml-frameworks
Last pushed
Sep 17, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/arogozhnikov/infiniteboost"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
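The curl command above can also be called from Python. A minimal sketch follows; the URL comes from the listing above, but the shape of the JSON response (its field names) is an assumption, so the fetch is shown without decoding specific keys.

```python
# Minimal sketch of querying the listing's API from Python.
# Only the endpoint URL is taken from the page above; the response
# schema is not documented here, so we just return the parsed JSON.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def repo_quality_url(category: str, owner: str, name: str) -> str:
    """Build the endpoint URL for one repository."""
    return f"{BASE}/{category}/{owner}/{name}"

def fetch_repo_quality(category: str, owner: str, name: str) -> dict:
    """Fetch and decode the JSON payload (keyless tier: 100 requests/day)."""
    with urllib.request.urlopen(repo_quality_url(category, owner, name)) as resp:
        return json.load(resp)

# Example (performs a network call, so not executed here):
# data = fetch_repo_quality("ml-frameworks", "arogozhnikov", "infiniteboost")
# print(data)
```

For the higher 1,000 requests/day tier, a free key would be passed per the service's instructions; how it is sent (header vs. query parameter) is not stated on this page.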
Higher-rated alternatives
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost
Natural Gradient Boosting for Probabilistic Prediction
lightgbm-org/LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost
Tree-Boosting, Gaussian Processes, and Mixed-Effects Models