szilard/GBM-perf
Performance of various open source GBM implementations
This project helps data scientists and machine learning engineers understand the real-world performance of popular gradient boosting machine (GBM) implementations. It compares training times and predictive performance (AUC) of tools like H2O, XGBoost, LightGBM, and CatBoost on large datasets, using both CPU and GPU hardware configurations. You can use it to make an informed decision about which GBM library will perform best for your specific predictive modeling tasks.
Use this if you are a data scientist or machine learning engineer building predictive models and need to choose the most performant gradient boosting library for your specific dataset size and hardware (CPU or GPU).
Not ideal if you are looking for a general introduction to gradient boosting or need to compare different machine learning algorithm types beyond GBMs.
Stars: 224
Forks: 30
Language: HTML
License: MIT
Category:
Last pushed: Feb 17, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/szilard/GBM-perf"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
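For scripted access, the curl call above translates directly to a few lines of Python. A minimal sketch using only the standard library; the URL pattern is generalized from the single example shown here, and the response field names are not documented on this page, so inspect the raw JSON before relying on a particular schema:

```python
import json
import urllib.request

# Base endpoint inferred from the curl example; only the exact URL for
# szilard/GBM-perf is confirmed on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def stats_url(category: str, owner: str, repo: str) -> str:
    """Build the stats endpoint URL for a repo in a given category."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_repo_stats(category: str, owner: str, repo: str) -> dict:
    """Fetch the repo record as a dict (no key needed up to 100 req/day)."""
    url = stats_url(category, owner, repo)
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)
```

Usage: `fetch_repo_stats("ml-frameworks", "szilard", "GBM-perf")` hits the same endpoint as the curl command above.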
Related frameworks
dmlc/xgboost: Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost: A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost: Natural Gradient Boosting for Probabilistic Prediction
lightgbm-org/LightGBM: A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost: Tree-Boosting, Gaussian Processes, and Mixed-Effects Models