XGBoost and GBM-perf
XGBoost is a gradient boosting library; GBM-perf benchmarks it against other open source GBM implementations to compare training speed and predictive accuracy.
About xgboost
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
XGBoost helps data scientists and machine learning engineers quickly build accurate predictive models for classification, regression, and ranking tasks. It takes structured, tabular datasets (like spreadsheets or database tables) and trains gradient-boosted decision tree ensembles for prediction. It suits practitioners who need robust, efficient predictive analytics on tabular data.
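To make the "gradient boosting" idea concrete: each round fits a weak learner to the residuals of the current ensemble and adds a shrunken copy of it. The sketch below is a minimal pure-Python illustration with depth-1 stumps on one feature, not XGBoost's actual implementation (which adds regularization, second-order gradients, and far more efficient tree construction):

```python
# Minimal gradient-boosting sketch for squared-error regression.
# Illustrative only; real libraries like XGBoost are far more sophisticated.

def fit_stump(xs, residuals):
    """Find the threshold and leaf means minimizing squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, n_rounds=50, lr=0.1):
    """Return a predictor: the mean of ys plus shrunken stump corrections."""
    base = sum(ys) / len(ys)
    stumps = []
    preds = [base] * len(ys)
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

# Toy step-function data: boosting recovers the jump at x = 5.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [1.0, 1.0, 1.0, 1.0, 9.0, 9.0, 9.0, 9.0]
model = boost(xs, ys)
```

After 50 rounds with a 0.1 learning rate, the residuals shrink geometrically, so `model(2)` lands near 1.0 and `model(7)` near 9.0.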
About GBM-perf
szilard/GBM-perf
Performance of various open source GBM implementations
This project helps data scientists and machine learning engineers understand the real-world performance of popular gradient boosting machine (GBM) implementations. It compares training times and accuracy (AUC) of tools like H2O, XGBoost, LightGBM, and CatBoost on large datasets, using both CPU and GPU hardware configurations. You can use this to make informed decisions about which GBM library will perform best for your specific predictive modeling tasks.
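GBM-perf's headline accuracy metric, AUC, is the probability that a randomly chosen positive example scores higher than a randomly chosen negative one. The following is a hedged sketch of that Mann-Whitney formulation for small score lists, not GBM-perf's own benchmark code (which uses the libraries' built-in evaluation on large datasets):

```python
# AUC as the fraction of positive/negative pairs ranked correctly,
# counting ties as half a win. Quadratic in sample size; fine for a demo.
def auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Example: 3 of 4 positive/negative pairs are ordered correctly.
score = auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])  # 0.75
```

Production code would use `sklearn.metrics.roc_auc_score`, which computes the same quantity efficiently from the ROC curve.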