StatMixedML/DGBM
Distributional Gradient Boosting Machines
When you need to predict not just a single value but an entire range of possible outcomes for a numerical variable, this tool helps. Given predictor variables and a numerical outcome, it estimates a full probability distribution for that outcome, letting statisticians, data scientists, and forecasters quantify predictive uncertainty and derive prediction intervals or specific quantiles.
No commits in the last 6 months.
Use this if you need the full spectrum of possible outcomes and their likelihoods, rather than a single average prediction.
Not ideal if you only need a point prediction and do not care about the uncertainty or range of possible outcomes.
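The difference from a point forecast can be sketched with scikit-learn's quantile-loss gradient boosting. This illustrates the general distributional-boosting idea, not DGBM's own API; the data and parameters are invented for the example:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Toy heteroscedastic data: noise grows with x (invented for illustration).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1 + 0.05 * X[:, 0])

# One model per quantile: the 10th and 90th percentiles bound an
# 80% prediction interval; the median serves as the point forecast.
models = {
    q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X, y)
    for q in (0.1, 0.5, 0.9)
}

X_new = np.array([[2.0], [8.0]])
lo, med, hi = (models[q].predict(X_new) for q in (0.1, 0.5, 0.9))
print(np.round(lo, 2), np.round(med, 2), np.round(hi, 2))
```

A point model would return only the middle number; the interval around it is what a distributional approach adds.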
Stars: 28
Forks: 2
Language: —
License: Apache-2.0
Category: ml-frameworks
Last pushed: Dec 13, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/StatMixedML/DGBM"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
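The endpoint follows a category/owner/repo pattern taken from the curl example above; a small helper can build the URL for any repository. The helper name is hypothetical, and the response schema is not documented here, so the sketch stops at constructing the URL:

```python
from urllib.parse import quote

# Base path copied from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repository (hypothetical helper)."""
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

print(quality_url("ml-frameworks", "StatMixedML", "DGBM"))
# Fetch with e.g. urllib.request.urlopen(quality_url(...)).read()
```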
Higher-rated alternatives
dmlc/xgboost: Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost: A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost: Natural Gradient Boosting for Probabilistic Prediction
lightgbm-org/LightGBM: A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost: Tree-Boosting, Gaussian Processes, and Mixed-Effects Models