ramprs/grad-cam

[ICCV 2017] Torch code for Grad-CAM

Score: 40 / 100 (Emerging)

This tool helps researchers and AI practitioners understand why an image recognition model made a specific decision. Given an image and the model's output (a classification, an answer to a visual question, or a caption), it produces a heatmap highlighting the regions of the image that most influenced that output, offering transparency into the model's "thought process." It is for anyone who needs to debug, explain, or trust the decisions of a deep learning model that works with images.

1,634 stars. No commits in the last 6 months.

Use this if you need to visually interpret the critical pixels or regions in an image that a convolutional neural network (CNN) used to arrive at its classification, answer to a question, or image caption.
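The Grad-CAM computation itself is compact: the gradients of the class score with respect to a chosen convolutional layer's feature maps are global-average-pooled into per-channel weights, and the heatmap is the ReLU of the weighted sum of those maps. A minimal NumPy sketch of that final step, assuming the activations and gradients have already been extracted from a network (the repo itself is Torch/Lua; all names here are illustrative, not from its code):

```python
import numpy as np

def grad_cam_heatmap(activations, gradients):
    """Combine conv feature maps and their gradients into a Grad-CAM heatmap.

    activations: (K, H, W) feature maps from the chosen conv layer.
    gradients:   (K, H, W) gradients of the class score w.r.t. those maps.
    """
    # Per-channel weights: global-average-pool each channel's gradient.
    weights = gradients.mean(axis=(1, 2))                    # shape (K,)
    # Weighted sum of feature maps, then ReLU to keep only positive influence.
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0.0)
    # Normalize to [0, 1] so it can be overlaid on the image as a heatmap.
    if cam.max() > 0:
        cam /= cam.max()
    return cam

# Example with random tensors standing in for real network outputs.
rng = np.random.default_rng(0)
acts = rng.standard_normal((8, 7, 7))
grads = rng.standard_normal((8, 7, 7))
heatmap = grad_cam_heatmap(acts, grads)
print(heatmap.shape)  # (7, 7)
```

In practice the resulting low-resolution map is upsampled to the input image size before being overlaid on the image.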

Not ideal if you are looking for a general-purpose explainability tool for non-image data or for model architectures other than CNNs.

Tags: AI explainability, computer vision, deep learning, interpretation, image classification, analysis, visual question answering
No License · Stale 6m · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 8 / 25
Community 22 / 25


Stars: 1,634
Forks: 236
Language: Lua
License: None
Last pushed: Sep 17, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ramprs/grad-cam"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
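The same endpoint can be called from a script. A minimal Python sketch using only the standard library; the endpoint and path come from the curl command above, but the structure of the JSON response is not documented here, so treat anything done with the decoded payload as an assumption:

```python
import json
from urllib.request import Request, urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection, owner, repo):
    """Build the quality-report URL for a repo in a given collection."""
    return f"{API_BASE}/{collection}/{owner}/{repo}"

def fetch_quality(collection, owner, repo, timeout=10):
    """Fetch and decode the JSON quality report (performs a network call)."""
    req = Request(quality_url(collection, owner, repo),
                  headers={"Accept": "application/json"})
    with urlopen(req, timeout=timeout) as resp:
        return json.load(resp)

print(quality_url("ml-frameworks", "ramprs", "grad-cam"))
# https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ramprs/grad-cam
```

With the anonymous tier capped at 100 requests/day, a script polling many repositories should batch or cache responses, or use a free key for the 1,000/day limit.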