gsurma/cnn_explainer
Making CNNs interpretable.
This project helps you understand why an image classification model made a specific decision: you provide an image and your trained model, and it generates visual explanations such as heatmaps or feature visualizations. It is aimed at practitioners, researchers, and data scientists who need to audit or explain the behavior of their computer vision models.
No commits in the last 6 months.
Use this if you need to visualize and interpret the decision-making process of your Convolutional Neural Networks for image classification tasks.
Not ideal if you are working with other types of neural networks or data beyond images, or if you need to optimize model performance rather than explain it.
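Explainers of this kind typically build their heatmaps from class activation maps (CAMs): the final convolutional feature maps are combined using the classifier weights for the target class, negative values are clipped, and the result is normalized for display. A minimal sketch of that idea with made-up numbers (pure Python, no specific framework assumed; the feature maps and weights are purely illustrative):

```python
# Class-activation-map sketch: weight each feature map by the
# classifier weight its channel contributes to the target class,
# sum, apply ReLU, then normalize to [0, 1] for display as a heatmap.

def class_activation_map(feature_maps, class_weights):
    """feature_maps: list of HxW grids (one per channel);
    class_weights: one weight per channel for the target class."""
    h, w = len(feature_maps[0]), len(feature_maps[0][0])
    cam = [[0.0] * w for _ in range(h)]
    for fmap, weight in zip(feature_maps, class_weights):
        for i in range(h):
            for j in range(w):
                cam[i][j] += weight * fmap[i][j]
    # ReLU: keep only positive evidence for the class
    cam = [[max(v, 0.0) for v in row] for row in cam]
    # Normalize to [0, 1] so the map can be rendered as a heatmap
    peak = max(max(row) for row in cam)
    if peak > 0:
        cam = [[v / peak for v in row] for row in cam]
    return cam

# Two 2x2 toy feature maps and hypothetical classifier weights
fmaps = [
    [[1.0, 0.0], [0.0, 2.0]],
    [[0.0, 3.0], [1.0, 0.0]],
]
weights = [0.5, -1.0]
print(class_activation_map(fmaps, weights))  # [[0.5, 0.0], [0.0, 1.0]]
```

In a real model the feature maps come from the last convolutional layer and the weights from the dense classification head; upsampling the resulting grid to the input resolution yields the familiar overlay heatmap.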
Stars: 19
Forks: 2
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Jul 09, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/gsurma/cnn_explainer"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
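The same call can be made from Python with only the standard library. The sketch below assumes the endpoint layout shown in the curl example (`/api/v1/quality/<category>/<owner>/<repo>`) and makes no assumption about the response schema:

```python
import json
from urllib.request import urlopen

# Base path taken from the curl example above
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    # Mirrors the documented endpoint layout:
    # /api/v1/quality/<category>/<owner>/<repo>
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo):
    # Returns the decoded JSON payload; the schema is not documented here.
    with urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

print(quality_url("ml-frameworks", "gsurma", "cnn_explainer"))
```

`fetch_quality` performs the actual request; within the anonymous tier, no authentication header is needed.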
Higher-rated alternatives
jacobgil/pytorch-grad-cam
Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers,...
frgfm/torch-cam
Class activation maps for your PyTorch models (CAM, Grad-CAM, Grad-CAM++, Smooth Grad-CAM++,...
jacobgil/keras-grad-cam
An implementation of Grad-CAM with keras
ramprs/grad-cam
[ICCV 2017] Torch code for Grad-CAM
innat/HybridModel-GradCAM
A Keras implementation of hybrid efficientnet swin transformer model.