camail-official/compressm
This is the official repository for the ICLR 2026 paper "The Curious Case of In-Training Compression of State Space Models".
This project helps machine learning researchers and practitioners optimize State Space Models (SSMs) for long-sequence data. Given raw training data for tasks such as image classification (sMNIST, sCIFAR), text classification (IMDB, AAN), or sequence reasoning (ListOps, Pathfinder), it produces a smaller, more computationally efficient model that maintains high performance. The intended users are researchers and engineers working on deep learning models for sequential data.
Use this if you are training State Space Models and need to reduce their computational cost and memory footprint during training without a significant loss in performance.
It is not a good fit if you work with other deep learning architectures (e.g., Transformers, CNNs) or are not comfortable with experimental machine learning research.
Stars: 12
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Feb 23, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/camail-official/compressm"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
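For scripted access, the curl command above can be replicated with Python's standard library. This is a minimal sketch: the endpoint URL comes from the listing above, but the response fields (`stars`, `language`, `license`) are assumptions, since the actual JSON schema is not documented here.

```python
import json
from urllib.request import urlopen

# Endpoint from the listing above; no API key needed for up to 100 requests/day.
API_URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/camail-official/compressm"

def parse_repo_stats(payload: str) -> dict:
    """Parse the API's JSON payload into a plain dict of repo stats.

    The field names below are assumed, not taken from official API docs.
    """
    data = json.loads(payload)
    return {
        "stars": data.get("stars"),
        "language": data.get("language"),
        "license": data.get("license"),
    }

if __name__ == "__main__":
    with urlopen(API_URL) as resp:
        print(parse_repo_stats(resp.read().decode("utf-8")))
```

If a field is absent from the response, `dict.get` returns `None` rather than raising, so the sketch degrades gracefully if the schema differs from the assumed one.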
Higher-rated alternatives
InterDigitalInc/CompressAI
A PyTorch library and evaluation platform for end-to-end compression research
quic/aimet
AIMET is a library that provides advanced quantization and compression techniques for trained...
tensorflow/compression
Data compression in TensorFlow
baler-collaboration/baler
Repository of Baler, a machine learning based data compression tool
thulab/DeepHash
An Open-Source Package for Deep Learning to Hash (DeepHash)