ramprs/grad-cam
[ICCV 2017] Torch code for Grad-CAM
This tool helps researchers and AI practitioners understand why an image recognition model makes a specific decision. You provide an image and the model's output (a classification, an answer to a question about the image, or a caption), and it produces a visual heatmap highlighting the regions of the image that most influenced that output, offering a window into the model's "thought process." It is for anyone who needs to debug, explain, or build trust in the decisions of a deep learning model that works with images.
1,634 stars. No commits in the last 6 months.
Use this if you need to visually interpret the critical pixels or regions in an image that a convolutional neural network (CNN) used to arrive at its classification, answer to a question, or image caption.
Not ideal if you are looking for a general-purpose explainability tool for non-image data or for model architectures other than CNNs.
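The heatmap described above comes from a simple computation: global-average-pool the gradients of the class score with respect to a convolutional layer's feature maps to get per-channel importance weights, then take a ReLU of the weighted sum of those feature maps. A minimal sketch of that core step, assuming the activations and gradients have already been extracted from a CNN (the function name and array shapes here are illustrative, not the repo's API):

```python
import numpy as np

def grad_cam(activations, gradients):
    """Grad-CAM heatmap from one conv layer's activations and gradients.

    activations: (K, H, W) feature maps A^k of the chosen conv layer
    gradients:   (K, H, W) gradients dY_c/dA^k of the target class score Y_c
    """
    # Neuron-importance weights: global-average-pool the gradients per channel.
    weights = gradients.mean(axis=(1, 2))             # shape (K,)
    # Weighted combination of feature maps.
    cam = np.einsum('k,khw->hw', weights, activations)
    # ReLU: keep only features with a positive influence on the class.
    cam = np.maximum(cam, 0.0)
    # Normalize to [0, 1] so it can be overlaid on the image as a heatmap.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

In practice the activations and gradients would be captured with framework hooks (e.g. forward/backward hooks in PyTorch), and the resulting low-resolution map is upsampled to the input image size before overlaying.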
Stars: 1,634
Forks: 236
Language: Lua
License: —
Category: —
Last pushed: Sep 17, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ramprs/grad-cam"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
jacobgil/pytorch-grad-cam
Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers,...
frgfm/torch-cam
Class activation maps for your PyTorch models (CAM, Grad-CAM, Grad-CAM++, Smooth Grad-CAM++,...
jacobgil/keras-grad-cam
An implementation of Grad-CAM with keras
matlab-deep-learning/Explore-Deep-Network-Explainability-Using-an-App
This repository provides an app for exploring the predictions of an image classification network...
innat/HybridModel-GradCAM
A Keras implementation of a hybrid EfficientNet/Swin Transformer model with Grad-CAM.