FarnoushRJ/MambaLRP
[NeurIPS 2024] Official implementation of the paper "MambaLRP: Explaining Selective State Space Sequence Models" 🐍
This project helps machine learning researchers and practitioners understand why Mamba models (selective state space sequence models) make particular predictions. Given a trained Mamba model and its output, it shows which parts of the input were most important for that specific prediction, offering insight into the model's decision-making process and supporting more reliable use in real-world applications.
No commits in the last 6 months.
Use this if you need to explain the reasoning behind predictions from your Mamba-based language models or other sequence processing applications.
Not ideal if you are working with traditional Transformer models or other deep learning architectures that are not Mamba-based.
Stars
45
Forks
8
Language
Python
License
MIT
Category
Last pushed
Nov 06, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/FarnoushRJ/MambaLRP"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
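The same endpoint can be queried from Python. The URL below is taken from the curl command above; the response schema is an assumption (the API's JSON fields are not documented here), so inspect the returned payload before relying on specific keys:

```python
import json
from urllib.request import urlopen

# Base endpoint from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the quality-API URL for a repo, e.g. 'FarnoushRJ/MambaLRP'."""
    return f"{BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    # Unauthenticated access is rate-limited to 100 requests/day.
    with urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)

# Example (performs a network request):
# data = fetch_quality("ml-frameworks", "FarnoushRJ/MambaLRP")
```

`fetch_quality` is a hypothetical helper name; only the URL itself comes from the listing.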
Higher-rated alternatives
kyegomez/VisionMamba
Implementation of Vision Mamba from the paper: "Vision Mamba: Efficient Visual Representation...
SiavashShams/ssamba
[SLT'24] The official implementation of SSAMBA: Self-Supervised Audio Representation Learning...
kkakkkka/MambaTalk
[NeurIPS 2024] The official code of MambaTalk: Efficient Holistic Gesture Synthesis with...
kaistmm/Audio-Mamba-AuM
Official Implementation of the work "Audio Mamba: Bidirectional State Space Model for Audio...
zs1314/SkinMamba
[ACCVW2025 Oral] Official PyTorch Code for "SkinMamba: A Precision Skin Lesion Segmentation...