KentaItakura/Explainable-AI-interpreting-the-classification-performed-by-deep-learning-with-LIME-using-MATLAB
This demo shows how to interpret a classification performed by a CNN using LIME (Local Interpretable Model-agnostic Explanations).
This tool helps scientists, engineers, and researchers understand *why* a deep learning model classified an image the way it did. You input an image and a classification from a deep learning model, and it outputs a visual overlay on the original image, highlighting the specific regions that most influenced that classification. This is ideal for anyone working with image classification who needs to verify model trustworthiness or diagnose model errors.
No commits in the last 6 months.
Use this if you need to explain or interpret the decision-making process of a Convolutional Neural Network (CNN) for image classification.
Not ideal if you are working with non-image data or deep learning models other than CNNs, or if you require an explanation method beyond LIME.
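To make the LIME idea concrete, here is a minimal NumPy sketch of the technique, not the repo's MATLAB implementation: it perturbs grid-shaped "superpixels" (a real implementation would use an image segmentation), queries the model on each perturbed image, and fits a weighted linear surrogate whose coefficients rank the regions by influence. All names and parameters below are illustrative.

```python
import numpy as np

def lime_image_sketch(image, classify, grid=(2, 2), n_samples=200, seed=0):
    """Toy LIME: return one importance weight per grid region of `image`.

    `classify` maps an image to a scalar score (e.g. a class probability).
    """
    H, W = image.shape
    gh, gw = grid
    rh, rw = H // gh, W // gw
    K = gh * gw
    rng = np.random.default_rng(seed)

    # Random binary masks: which regions are kept in each perturbed sample.
    Z = rng.integers(0, 2, size=(n_samples, K))
    y = np.empty(n_samples)
    for i, z in enumerate(Z):
        pert = image.copy()
        for k in range(K):
            if z[k] == 0:  # region k removed: blank it out
                r, c = divmod(k, gw)
                pert[r * rh:(r + 1) * rh, c * rw:(c + 1) * rw] = 0.0
        y[i] = classify(pert)

    # Proximity kernel: samples closer to the original image count more.
    dist = 1.0 - Z.mean(axis=1)            # fraction of regions removed
    w = np.exp(-(dist ** 2) / 0.25)

    # Weighted ridge regression: linear surrogate over the region masks.
    Zw = Z * w[:, None]
    A = Zw.T @ Z + 1e-3 * np.eye(K)
    b = Zw.T @ y
    return np.linalg.solve(A, b)           # importance weight per region

# Toy "model": the score is the mean brightness of the top-left quadrant,
# so region 0 should receive by far the largest weight.
weights = lime_image_sketch(np.ones((8, 8)), lambda im: im[:4, :4].mean())
```

The repo performs the same loop with a real CNN's class probability as `classify` and overlays the resulting weights on the input image.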
Stars: 13
Forks: 1
Language: MATLAB
License: BSD-3-Clause
Last pushed: Dec 06, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/KentaItakura/Explainable-AI-interpreting-the-classification-performed-by-deep-learning-with-LIME-using-MATLAB"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
jacobgil/pytorch-grad-cam
Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers,...
frgfm/torch-cam
Class activation maps for your PyTorch models (CAM, Grad-CAM, Grad-CAM++, Smooth Grad-CAM++,...
jacobgil/keras-grad-cam
An implementation of Grad-CAM with keras
ramprs/grad-cam
[ICCV 2017] Torch code for Grad-CAM
matlab-deep-learning/Explore-Deep-Network-Explainability-Using-an-App
This repository provides an app for exploring the predictions of an image classification network...