koalaverse/vip

Variable Importance Plots (VIPs)

Quality score: 40 / 100 (Emerging)

This tool helps data scientists and machine learning engineers understand which factors are most influential in their predictive models. You provide a trained model (such as a linear regression, random forest, or neural network), and it generates clear, publication-ready plots showing the relative importance of each input variable. This makes it easier to explain complex models to stakeholders and to gain insight into underlying relationships.


Use this if you need to explain the decisions of your machine learning models by identifying which features contribute most to their predictions, without needing to learn different methods for each model type.

Not ideal if you are not working with R or if you primarily need to build predictive models rather than interpret existing ones.
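As a sketch of typical usage (the model, dataset, and settings here are illustrative; `vip()` and its `num_features` argument come from the vip package, while `randomForest` is one of many supported model types):

```r
# Fit a random forest on the built-in mtcars data, then plot
# the most important predictors with vip.
library(randomForest)
library(vip)

set.seed(101)
rf <- randomForest(mpg ~ ., data = mtcars, importance = TRUE)

# vip() returns a ggplot2 object, so the plot can be themed
# or saved like any other ggplot.
vip(rf, num_features = 10)
```

The same `vip()` call works across model classes, which is the "without needing to learn different methods for each model type" point above.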

Tags: machine-learning-interpretability, data-science, model-explanation, feature-importance, predictive-analytics
No license · no package · no dependents

Score breakdown:
Maintenance: 6 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 16 / 25


Stars: 190
Forks: 23
Language: R
License: none
Last pushed: Dec 12, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/koalaverse/vip"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.