YecanLee/min-LSTM-torch
Unofficial PyTorch Implementation of "Were RNNs All We Needed?"
This project lets machine learning researchers and practitioners experiment with a compact variant of the Long Short-Term Memory (LSTM) network. It takes structured sequence data as input and, after training, reports performance metrics such as accuracy and loss on a validation set. It is aimed at those exploring efficient recurrent neural network architectures.
No commits in the last 6 months.
Use this if you are a machine learning researcher or student interested in replicating or studying the min-LSTM architecture from the paper "Were RNNs All We Needed?".
Not ideal if you need a production-ready, highly optimized, or extensively documented LSTM implementation for deployment in a real-world application.
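The core idea behind min-LSTM, as presented in the paper, is to drop the hidden-state dependence from the gates (they are computed from the current input only) and to normalize the forget and input gates so they sum to one. A single recurrence step under that formulation can be sketched in NumPy as follows; the weight names (`W_f`, `W_i`, `W_h`) and dimensions are illustrative and not taken from this repository's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def min_lstm_step(x_t, h_prev, W_f, W_i, W_h):
    # Gates depend only on the current input x_t (no h_{t-1} term),
    # which is what makes the recurrence amenable to parallel scans.
    f = sigmoid(x_t @ W_f)        # forget gate
    i = sigmoid(x_t @ W_i)        # input gate
    h_tilde = x_t @ W_h           # candidate hidden state
    f_norm = f / (f + i)          # normalize so the two gates sum to 1
    i_norm = i / (f + i)
    return f_norm * h_prev + i_norm * h_tilde

# Tiny demo: run the recurrence over a random sequence.
rng = np.random.default_rng(0)
d_in, d_hid, T = 4, 8, 5
W_f, W_i, W_h = (rng.normal(size=(d_in, d_hid)) for _ in range(3))
h = np.zeros(d_hid)
for _ in range(T):
    h = min_lstm_step(rng.normal(size=d_in), h, W_f, W_i, W_h)
print(h.shape)  # prints (8,)
```

Because each gate is a convex combination weight, the hidden state stays a bounded blend of the previous state and the new candidate; the repository's PyTorch version would express the same step with `nn.Linear` layers and batched tensors.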
Stars: 17
Forks: 1
Language: Python
License: MIT
Category:
Last pushed: Mar 20, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/YecanLee/min-LSTM-torch"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
kk7nc/RMDL
RMDL: Random Multimodel Deep Learning for Classification
MaximeVandegar/Papers-in-100-Lines-of-Code
Implementation of papers in 100 lines of code.
OML-Team/open-metric-learning
Metric learning and retrieval pipelines, models and zoo.
miguelvr/dropblock
Implementation of DropBlock: A regularization method for convolutional networks in PyTorch.
DLTK/DLTK
Deep Learning Toolkit for Medical Image Analysis