YecanLee/min-LSTM-torch

Unofficial PyTorch Implementation of "Were RNNs All We Needed?"

Quality score: 27 / 100 (Experimental)

This project lets machine learning researchers and practitioners experiment with minLSTM, a compact variant of the Long Short-Term Memory (LSTM) network. It trains on structured sequence data and reports performance metrics such as accuracy and loss on a validation set. It's designed for those exploring efficient recurrent neural network architectures.

No commits in the last 6 months.

Use this if you are a machine learning researcher or student interested in replicating or studying the minLSTM architecture from the paper "Were RNNs All We Needed?".

Not ideal if you need a production-ready, highly optimized, or extensively documented LSTM implementation for deployment in a real-world application.
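
The core idea of the minLSTM architecture from the paper is that the forget and input gates depend only on the current input (not the previous hidden state), and the two gates are normalized to sum to one. A minimal NumPy sketch of that recurrence, for orientation only; the weight names (`W_f`, `W_i`, `W_h`) and dimensions are illustrative and not taken from this repo:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def min_lstm_step(x_t, h_prev, W_f, W_i, W_h):
    # Gates are computed from x_t alone; dropping the h_prev term
    # is the key simplification in the minLSTM formulation.
    f = sigmoid(W_f @ x_t)
    i = sigmoid(W_i @ x_t)
    h_tilde = W_h @ x_t  # candidate hidden state
    # Normalize so the forget and input gates sum to 1.
    f_norm = f / (f + i)
    i_norm = i / (f + i)
    return f_norm * h_prev + i_norm * h_tilde

# Tiny demo with random weights (dimensions are arbitrary).
rng = np.random.default_rng(0)
d_in, d_hidden, T = 4, 3, 5
W_f, W_i, W_h = (rng.standard_normal((d_hidden, d_in)) for _ in range(3))
h = np.zeros(d_hidden)
for x_t in rng.standard_normal((T, d_in)):
    h = min_lstm_step(x_t, h, W_f, W_i, W_h)
# h now holds the final hidden state, shape (d_hidden,)
```

Because the gates don't depend on the hidden state, the recurrence can also be unrolled with a parallel scan at training time, which is the efficiency argument the paper makes.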

deep-learning recurrent-neural-networks model-experimentation academic-research pytorch
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 5 / 25

How are scores calculated?

Stars: 17

Forks: 1

Language: Python

License: MIT

Last pushed: Mar 20, 2025

Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/YecanLee/min-LSTM-torch"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.