xgboost and GBM-perf

XGBoost is a gradient boosting library; GBM-perf is a benchmark suite that measures its training speed and accuracy against other open source GBM implementations.

                 xgboost           GBM-perf
Score            85 (Verified)     53 (Established)
Maintenance      20/25             10/25
Adoption         15/25             10/25
Maturity         25/25             16/25
Community        25/25             17/25
Stars            28,121            224
Forks            8,847             30
Commits (30d)    38                0
Language         C++               HTML
License          Apache-2.0        MIT
No risk flags. No package, no dependents.

About xgboost

dmlc/xgboost

Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow

XGBoost helps data scientists and machine learning engineers quickly build highly accurate predictive models for classification, regression, and ranking tasks. It takes structured datasets (like spreadsheets or database tables) and outputs a powerful model capable of making predictions. This tool is ideal for professionals who need to develop robust and efficient predictive analytics solutions.

predictive-modeling machine-learning-engineering data-science business-forecasting risk-assessment

About GBM-perf

szilard/GBM-perf

Performance of various open source GBM implementations

This project helps data scientists and machine learning engineers understand the real-world performance of popular gradient boosting machine (GBM) implementations. It compares training times and accuracy (AUC) of tools like H2O, XGBoost, LightGBM, and CatBoost on large datasets, using both CPU and GPU hardware configurations. You can use this to make informed decisions about which GBM library will perform best for your specific predictive modeling tasks.

predictive-modeling machine-learning-engineering performance-benchmarking data-science boosted-trees

Scores updated daily from GitHub, PyPI, and npm data.