Swiggy/Moo-GBT

Library for Multi-objective optimization in Gradient Boosted Trees

Score: 52 / 100 (Established)

This project helps data scientists and machine learning engineers fine-tune gradient-boosted tree models, ensuring they meet multiple performance targets simultaneously. You provide your training data with a primary outcome and one or more secondary outcomes (e.g., 'is_booking' as primary and 'is_package' as secondary). The tool then outputs a trained model that optimizes the primary objective while keeping the secondary objectives' performance within specified upper bounds.
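The "optimize a primary objective subject to upper bounds on secondary objectives" setup can be sketched abstractly with a penalized gradient descent plus dual ascent on a Lagrange multiplier. This is a toy illustration of the constrained-optimization pattern only; the functions, losses, and hyperparameters below are invented for the example and are not Moo-GBT's API.

```python
# Toy constrained optimization: minimize a primary loss while keeping a
# secondary loss under an upper bound. The scalar "model" and quadratic
# losses are illustrative stand-ins, not the library's implementation.

def primary_loss(w):
    # Loss we actually want to minimize; unconstrained optimum is w = 3.
    return (w - 3.0) ** 2

def secondary_loss(w):
    # Loss that must stay at or below BOUND.
    return (w - 1.0) ** 2

BOUND = 1.0

def constrained_fit(steps=50_000, lr=0.005, dual_lr=0.01):
    w, lam = 0.0, 0.0  # parameter and Lagrange multiplier
    for _ in range(steps):
        # Gradient of the primary loss, plus the weighted secondary
        # gradient whenever the constraint is violated.
        grad = 2.0 * (w - 3.0)
        if secondary_loss(w) > BOUND:
            grad += lam * 2.0 * (w - 1.0)
        w -= lr * grad
        # Dual ascent: grow the multiplier while the bound is violated,
        # shrink it (toward zero) while there is slack.
        lam = max(0.0, lam + dual_lr * (secondary_loss(w) - BOUND))
    return w, lam

w, lam = constrained_fit()
# The bound pulls the solution from the unconstrained optimum w = 3
# back to the constraint boundary, where secondary_loss(w) == BOUND.
```

The same shape applies to the booking example above: the primary gradient drives 'is_booking' performance, while the multiplier term only activates when the 'is_package' objective exceeds its configured upper bound.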

No commits in the last 6 months. Available on PyPI.

Use this if you need to build predictive models where optimizing a main goal is important, but you also have critical secondary metrics that must not degrade beyond a certain threshold, like in recommendation systems or fraud detection.

Not ideal if your problem only involves optimizing a single objective, or if you prefer a different type of machine learning model than gradient-boosted trees.

Tags: predictive-modeling · recommendation-systems · ranking · fraud-detection · constrained-optimization
Stale (6 months) · No dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 25 / 25
Community 18 / 25

Stars: 78
Forks: 15
Language: Python
License: BSD-3-Clause
Last pushed: Aug 19, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Swiggy/Moo-GBT"

Open to everyone: 100 requests/day with no key required. A free API key raises the limit to 1,000 requests/day.