serengil/chefboost
A lightweight decision tree framework supporting regular algorithms (ID3, C4.5, CART, CHAID and regression trees) and some advanced techniques (gradient boosting, random forest and AdaBoost), with categorical feature support, for Python
This tool helps data analysts and domain experts create clear, rule-based models from their data. You input a dataset, often with both numbers and categories, and it outputs a set of 'if-then' rules that explain predictions. This is ideal for someone who needs to understand the logic behind a classification or prediction, rather than just getting an answer.
486 stars. No commits in the last 6 months. Available on PyPI.
Use this if you need to build interpretable decision-making models from your data, especially if your data includes important categorical information.
Not ideal if you're looking for a simple, out-of-the-box solution without any programming or data handling knowledge.
Stars
486
Forks
101
Language
Python
License
MIT
Category
ML Frameworks
Last pushed
Jul 09, 2025
Commits (30d)
0
Dependencies
4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/serengil/chefboost"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost
Natural Gradient Boosting for Probabilistic Prediction
lightgbm-org/LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost
Tree-Boosting, Gaussian Processes, and Mixed-Effects Models