Swiggy/Moo-GBT
Library for Multi-objective optimization in Gradient Boosted Trees
This project helps data scientists and machine learning engineers train gradient-boosted tree models that satisfy multiple performance targets at once. You supply training data with a primary outcome and one or more secondary outcomes (e.g., 'is_booking' as primary and 'is_package' as secondary). The tool then produces a trained model that optimizes the primary objective while constraining each secondary objective's performance to stay within a specified upper bound.
No commits in the last 6 months. Available on PyPI.
Use this if you need to build predictive models where optimizing a main goal is important but critical secondary metrics must not degrade beyond a set threshold, as in recommendation systems or fraud detection.
Not ideal if your problem only involves optimizing a single objective, or if you prefer a different type of machine learning model than gradient-boosted trees.
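The training scheme described above can be sketched as constrained optimization with a Lagrangian dual update: minimize the primary loss subject to the secondary loss staying under a bound, increasing a multiplier whenever the bound is violated. The sketch below is illustrative only, not Moo-GBT's API: it uses a plain linear model and NumPy instead of gradient-boosted trees, and all names (`bound`, `alpha`, the synthetic labels) are assumptions made for the demo.

```python
# Minimal sketch of constrained multi-objective training, assuming a
# Lagrangian dual-update scheme: minimize primary loss subject to
# secondary loss <= bound. NOT the Moo-GBT API -- a linear model stands
# in for the boosted trees so the idea stays self-contained.
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y1 = (X @ w_true > 0).astype(float)                          # primary, e.g. is_booking
y2 = (X @ (w_true + rng.normal(size=d)) > 0).astype(float)   # secondary, e.g. is_package

def logloss(w, y):
    """Mean logistic loss of linear scores X @ w against labels y."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    eps = 1e-12
    return float(-np.mean(y * np.log(p + eps) + (1.0 - y) * np.log(1.0 - p + eps)))

def grad(w, y):
    """Gradient of the mean logistic loss with respect to w."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / n

bound = 0.55   # hypothetical upper bound on the secondary objective's loss
alpha = 0.0    # Lagrange multiplier for the secondary-loss constraint
w = np.zeros(d)
for _ in range(500):
    # Descend on primary loss + alpha * secondary loss ...
    w -= 0.5 * (grad(w, y1) + alpha * grad(w, y2))
    # ... and raise alpha while the secondary constraint is violated
    # (projected so the multiplier stays non-negative).
    alpha = max(0.0, alpha + 0.2 * (logloss(w, y2) - bound))

print(f"primary loss: {logloss(w, y1):.3f}")
print(f"secondary loss: {logloss(w, y2):.3f} (bound {bound})")
```

The key design point is that the multiplier is learned rather than hand-tuned: when the secondary loss sits safely under its bound, `alpha` decays toward zero and training behaves like single-objective optimization; when the bound is breached, `alpha` grows until the secondary objective is pulled back within limits.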
Stars
78
Forks
15
Language
Python
License
BSD-3-Clause
Category
ML frameworks
Last pushed
Aug 19, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Swiggy/Moo-GBT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost
Natural Gradient Boosting for Probabilistic Prediction
microsoft/LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost
Tree-Boosting, Gaussian Processes, and Mixed-Effects Models