AmirhosseinHonardoust/Shap-Mini

A minimal, reproducible explainable-AI demo using SHAP values on tabular data. Trains RandomForest or LogisticRegression models, computes global and local feature importances, and visualizes results through summary and dependence plots, all in under 100 lines of Python.
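The training half of the pipeline described above can be sketched roughly as follows, assuming scikit-learn and synthetic stand-in data (the dataset, seed, and hyperparameters here are illustrative, not the repo's actual values); the SHAP explanation step itself, done via the shap package, is omitted and only the model's built-in global importances are shown:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic tabular data standing in for the user's table.
X, y = make_classification(n_samples=300, n_features=5, n_informative=3,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Global importances from the model itself; a SHAP summary plot would
# refine this picture with per-sample attributions.
importances = model.feature_importances_
print(dict(enumerate(np.round(importances, 3))))
```

A real run of the repo would hand `model` and `X_test` to a SHAP explainer and render summary and dependence plots from the resulting attributions.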

Score: 30 / 100 (Emerging)

This project helps when you have a machine learning model trained on tabular data and need to understand why it makes certain predictions. It takes your tabular data and a trained model, then generates visual explanations such as 'which features are most important overall' and 'how a specific feature impacts predictions'. It is aimed at data analysts, business intelligence professionals, or anyone who uses simple classification models and needs to explain their decisions clearly to stakeholders.

Use this if you need to quickly generate clear, visual explanations for a tabular machine learning model's predictions, especially when using common models like Random Forest or Logistic Regression.

Not ideal if your models are complex deep learning networks, handle image/text data, or if you need highly customized, interactive explainability dashboards.
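The SHAP values the project computes are Shapley values over feature coalitions: each feature's attribution is its average marginal contribution across all subsets of the other features. A minimal, dependency-free sketch of that idea (brute-force exact Shapley for a toy linear model, with absent features replaced by baseline values; this is an illustration, not the shap library's optimized estimators):

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley attributions for model f at point x.

    Features absent from a coalition are replaced by their baseline
    values; each feature's value is its weighted average marginal
    contribution over all coalitions of the remaining features.
    """
    n = len(x)

    def value(coalition):
        z = [x[i] if i in coalition else baseline[i] for i in range(n)]
        return f(z)

    phi = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        total = 0.0
        for size in range(n):
            for subset in combinations(others, size):
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                coalition = set(subset)
                total += weight * (value(coalition | {i}) - value(coalition))
        phi.append(total)
    return phi

# Toy linear model f(x) = 2*x0 + 3*x1 - x2: the Shapley value of
# feature i reduces to w_i * (x_i - baseline_i).
f = lambda z: 2 * z[0] + 3 * z[1] - z[2]
phi = shapley_values(f, x=[1.0, 2.0, 3.0], baseline=[0.0, 0.0, 0.0])
print([round(v, 6) for v in phi])  # → [2.0, 6.0, -3.0]
```

The brute force is exponential in the number of features, which is why the shap package uses model-specific shortcuts (e.g. tree-path algorithms for random forests); the efficiency property still holds here: the attributions sum to f(x) minus f(baseline).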

predictive-modeling model-auditing feature-importance decision-explanation business-intelligence
No Package · No Dependents
Maintenance 6 / 25
Adoption 6 / 25
Maturity 13 / 25
Community 5 / 25


Stars: 20
Forks: 1
Language: Python
License: MIT
Last pushed: Nov 09, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AmirhosseinHonardoust/Shap-Mini"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.