VITA-Group/SMC-Bench
[ICLR 2023] "Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!" Shiwei Liu, Tianlong Chen, Zhenyu Zhang, Xuxi Chen, Tianjin Huang, Ajay Kumar Jaiswal, Zhangyang Wang
This project helps machine learning researchers and practitioners evaluate how well sparse neural networks hold up on complex tasks. It applies different sparsity algorithms to existing deep learning models and reports how the sparsified models perform on tasks such as commonsense reasoning, multilingual translation, and protein prediction. Researchers and engineers working with large neural networks can use it to gauge the true potential of sparsity.
No commits in the last 6 months.
Use this if you are a machine learning researcher or engineer interested in assessing and comparing the performance of different sparse neural network algorithms on diverse, challenging datasets.
Not ideal if you are looking for a tool to train a new sparse model from scratch or for general-purpose model training outside of benchmarking sparsity.
Stars
28
Forks
3
Language
Python
License
—
Category
—
Last pushed
Aug 29, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/VITA-Group/SMC-Bench"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
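The curl command above can also be issued from Python. A minimal sketch, assuming the endpoint returns JSON; the `Authorization: Bearer` header used for the optional API key is an assumption, not documented behavior:

```python
import json
import urllib.request

# Endpoint from the curl example above.
API_URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/VITA-Group/SMC-Bench"

def fetch_card(url=API_URL, api_key=None):
    """Fetch the quality card for a repository and return it as a dict.

    Without a key you get 100 requests/day; a free key raises that to
    1,000/day. The header name below is an assumption.
    """
    req = urllib.request.Request(url)
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Call `fetch_card()` for anonymous access, or `fetch_card(api_key="...")` once you have a key.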
Higher-rated alternatives
InterDigitalInc/CompressAI
A PyTorch library and evaluation platform for end-to-end compression research
quic/aimet
AIMET is a library that provides advanced quantization and compression techniques for trained...
tensorflow/compression
Data compression in TensorFlow
baler-collaboration/baler
Repository of Baler, a machine learning based data compression tool
thulab/DeepHash
An Open-Source Package for Deep Learning to Hash (DeepHash)