CaptainE/RNN-LSTM-in-numpy

Building a RNN and LSTM from scratch with NumPy.

Quality score: 37 / 100 (Emerging)

This project helps deep learning students and researchers understand how recurrent neural networks (RNNs) and Long Short-Term Memory (LSTM) networks process sequential data. It takes simple character sequences as input and predicts the next character, exposing the internal workings of these models. It is aimed at anyone learning or teaching the foundational mechanics of deep learning models for sequence prediction.
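For a sense of the mechanics this kind of from-scratch implementation walks through, here is a minimal sketch of a single forward step of each cell in NumPy. This is illustrative only, not the repository's actual code; the function and weight names (`rnn_step`, `lstm_step`, `Wxh`, etc.) are assumptions chosen for clarity.

```python
import numpy as np

def rnn_step(x, h_prev, Wxh, Whh, bh):
    # One vanilla RNN step: the new hidden state mixes the
    # current input with the previous hidden state.
    return np.tanh(Wxh @ x + Whh @ h_prev + bh)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    # One LSTM step. W stacks the four gate weight matrices
    # (input, forget, output, candidate) acting on [h_prev; x].
    H = h_prev.shape[0]
    z = W @ np.concatenate([h_prev, x]) + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c = f * c_prev + i * g     # new cell state
    h = o * np.tanh(c)         # new hidden state
    return h, c
```

In a character-level model, `x` would be a one-hot vector over the vocabulary, and the hidden state `h` would be fed through a linear layer and softmax to score the next character.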

No commits in the last 6 months.

Use this if you are studying or teaching the core principles of RNNs and LSTMs and need a hands-on, step-by-step implementation from scratch to grasp their internal mechanics.

Not ideal if you need to build or apply a production-ready sequence prediction model for a real-world dataset, as this is an educational implementation focused on learning.

deep-learning-education neural-network-fundamentals sequential-data-modeling language-modeling-basics
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 13 / 25


Stars: 42
Forks: 6
Language: Jupyter Notebook
License: GPL-3.0
Last pushed: Jun 27, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/CaptainE/RNN-LSTM-in-numpy"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.