linkedin/TE2Rules

Python library to explain Tree Ensemble models (TE) like XGBoost, using a rule list.

Score: 45 / 100 (Emerging)

When you have a machine learning model, like XGBoost or Random Forest, making important predictions (e.g., in finance or healthcare), it can be hard to understand *why* it made a specific decision. This project helps you get clear, human-readable rules that explain what conditions lead the model to classify something a certain way. It takes your trained model and the data it learned from, and outputs a concise list of 'if-then' rules, which data scientists or domain experts can use to understand and validate the model's logic.
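In practice the workflow looks roughly like the sketch below: train a tree-ensemble classifier, then hand the fitted model and data to an explainer and print the extracted rules. The TE2Rules-specific names (`ModelExplainer`, `explain`) are an assumption based on the project's documented usage and may differ between versions; the scikit-learn part is standard.

```python
# Sketch only: the te2rules API names below (ModelExplainer, explain) are
# assumed from the project's documentation and are guarded by a try/except
# so the sklearn portion runs even without te2rules installed.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Train a tree-ensemble binary classifier on toy data.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
feature_names = [f"f{i}" for i in range(4)]
model = GradientBoostingClassifier(n_estimators=10, random_state=0).fit(X, y)

try:
    from te2rules.explainer import ModelExplainer

    explainer = ModelExplainer(model=model, feature_names=feature_names)
    # explain() is expected to return a list of human-readable
    # 'if-then' rule strings describing the model's positive class.
    rules = explainer.explain(X=X.tolist(), y=model.predict(X).tolist())
    for rule in rules:
        print(rule)
except ImportError:
    # te2rules not installed; the trained ensemble is still usable.
    print("te2rules not available; trained model:", model)
```

The rule list can then be reviewed by a domain expert to validate the model's logic before deployment.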

No commits in the last 6 months. Available on PyPI.

Use this if you need to understand the underlying logic of your Tree Ensemble models (like XGBoost or Random Forest) for binary classification tasks, especially in high-stakes domains where transparency and trust are crucial.

Not ideal if your model is not a Tree Ensemble (e.g., a neural network) or if you are working on a regression task rather than binary classification.

Tags: Machine Learning Interpretability, Model Explanation, Healthcare AI, Financial Risk Assessment, Compliance
Status: Stale (6 months)
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 25 / 25
Community: 12 / 25


Stars: 63
Forks: 7
Language: Python
License:
Last pushed: Apr 22, 2024
Commits (30d): 0
Dependencies: 6

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/linkedin/TE2Rules"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.