HITESHLPATEL/Mamba-Papers
Awesome Mamba Papers: A Curated Collection of Research Papers, Tutorials & Blogs
This collection helps researchers and practitioners track advances in Mamba models, which offer a powerful alternative to Transformers for many AI tasks. It centralizes research papers, code implementations, and educational resources such as videos and blogs, making it easier to discover and understand how Mamba models are being applied. It is useful for anyone working with AI models for tasks like image segmentation, medical imaging, or general sequence modeling.
No commits in the last 6 months.
Use this if you are an AI researcher or machine learning engineer looking for a comprehensive overview and resources on Mamba models and their applications.
Not ideal if you are looking for a plug-and-play AI tool or an introductory guide to machine learning concepts.
Stars
26
Forks
5
Language
—
License
—
Category
Last pushed
Mar 25, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/HITESHLPATEL/Mamba-Papers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
zhuhanqing/APOLLO
APOLLO: SGD-like Memory, AdamW-level Performance; MLSys'25 Outstanding Paper Honorable Mention
zhenye234/xcodec
AAAI 2025: Codec Does Matter: Exploring the Semantic Shortcoming of Codec for Audio Language Model
Y-Research-SBU/CSRv2
Official Repository for CSRv2 - ICLR 2026
psychofict/llm-effective-context-length
Investigating Why the Effective Context Length of LLMs Falls Short (Based on STRING, ICLR 2025)
rishikksh20/mamba3-pytorch
Readable implementation of Mamba 3 SSM model