dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
XGBoost helps data scientists and machine learning engineers quickly build highly accurate predictive models for classification, regression, and ranking tasks. It takes structured, tabular datasets (like spreadsheets or database tables) and trains a gradient-boosted tree model for making predictions. This tool is ideal for practitioners who need robust, efficient predictive analytics on tabular data.
28,121 stars. Used by 122 other packages. Actively maintained with 38 commits in the last 30 days. Available on PyPI.
Use this if you need to build fast, accurate predictive models from large, structured datasets, especially in contexts where performance and scalability are critical.
Not ideal if your primary task involves processing unstructured data like images, audio, or raw text, or if you need highly interpretable models rather than maximum predictive power.
Stars: 28,121
Forks: 8,847
Language: C++
License: Apache-2.0
Category: ML frameworks
Last pushed: Mar 13, 2026
Commits (30d): 38
Dependencies: 3
Reverse dependents: 122
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/dmlc/xgboost"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost
Natural Gradient Boosting for Probabilistic Prediction
microsoft/LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost
Tree-Boosting, Gaussian Processes, and Mixed-Effects Models
serengil/chefboost
A Lightweight Decision Tree Framework supporting regular algorithms: ID3, C4.5, CART, CHAID and...