Xtra-Computing/thundergbm
ThunderGBM: Fast GBDTs and Random Forests on GPUs
This tool helps data scientists and machine learning practitioners quickly build predictive models using Gradient Boosted Decision Trees (GBDTs) and Random Forests. It takes your raw dataset as input and outputs a trained model for classification, regression, or ranking tasks. It's designed for users who need to train high-performing models efficiently by offloading the work to a GPU.
711 stars. No commits in the last 6 months.
Use this if you need to rapidly train GBDT or Random Forest models on large datasets and have access to an NVIDIA GPU.
Not ideal if you are on macOS or do not have an NVIDIA GPU, since the library depends on CUDA.
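As a quick orientation, ThunderGBM exposes a scikit-learn-style estimator interface (its README documents `TGBMClassifier` with parameters such as `n_trees` and `depth`). The sketch below is a minimal, hedged example of that pattern; the import is guarded so it degrades gracefully on machines without the library or an NVIDIA GPU, and the tiny toy dataset stands in for the large datasets the tool is built for.

```python
# Hedged sketch of ThunderGBM's scikit-learn-style API, assuming the
# TGBMClassifier estimator and its n_trees/depth parameters as shown in
# the project's README. Requires CUDA and an NVIDIA GPU at runtime.
try:
    from thundergbm import TGBMClassifier  # GPU-backed GBDT classifier
    HAVE_TGBM = True
except ImportError:
    HAVE_TGBM = False

def train_demo():
    if not HAVE_TGBM:
        return "thundergbm not available (no NVIDIA GPU or not installed)"
    # Toy XOR-style dataset; real use would load a large dataset.
    X = [[0, 0], [0, 1], [1, 0], [1, 1]] * 50
    y = [0, 1, 1, 0] * 50
    clf = TGBMClassifier(n_trees=10, depth=4)
    clf.fit(X, y)       # training runs on the GPU
    return clf.predict(X)

print(train_demo())
```

Because the estimator follows scikit-learn conventions (`fit`/`predict`), it can usually drop into existing pipelines where an XGBoost or LightGBM model is used today.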
Stars: 711
Forks: 87
Language: C++
License: Apache-2.0
Category:
Last pushed: Mar 19, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Xtra-Computing/thundergbm"
Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.
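If you consume the API programmatically, a small parser over the response is usually all you need. The field names below (`stars`, `forks`, `commits_30d`, and so on) are assumptions mirroring the stats shown on this page, not the documented response schema; check the actual payload before relying on them.

```python
import json

# Hypothetical response body for the quality API above; the field names
# are assumptions for illustration, mirroring the stats on this page.
sample = '''{
  "repo": "Xtra-Computing/thundergbm",
  "stars": 711,
  "forks": 87,
  "language": "C++",
  "license": "Apache-2.0",
  "last_pushed": "2025-03-19",
  "commits_30d": 0
}'''

data = json.loads(sample)
# Simple staleness check: flag repos with no commits in the last 30 days.
is_stale = data["commits_30d"] == 0
print(f'{data["repo"]}: {data["stars"]} stars, stale={is_stale}')
```

In real use you would replace `sample` with the body returned by the `curl` request above.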
Higher-rated alternatives
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost
Natural Gradient Boosting for Probabilistic Prediction
lightgbm-org/LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost
Tree-Boosting, Gaussian Processes, and Mixed-Effects Models