Anri-Lombard/micrograd
Building Andrej Karpathy's micrograd from scratch
This project helps aspiring machine learning practitioners and students understand how deep learning works at a fundamental level. It takes basic mathematical inputs and, through a simplified neural network, shows how predictions are made and then refined. The ideal user is someone learning deep learning from scratch.
No commits in the last 6 months.
Use this if you want to truly understand the core mechanics of neural network training, including how gradients are calculated and weights are updated, without the complexities of production-ready libraries.
Not ideal if you need to build and train large, efficient, or production-grade deep learning models.
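To make "how gradients are calculated and weights are updated" concrete, here is a minimal sketch in the spirit of micrograd's scalar `Value` class. This is an illustrative reimplementation, not code from this repository: it supports only `+` and `*`, builds the computation graph as it goes, and replays it in reverse to apply the chain rule.

```python
class Value:
    """A scalar that records how it was computed so gradients can flow back.
    Illustrative sketch in the spirit of micrograd's Value; not the repo's code."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(a+b)/da = 1 and d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # d(a*b)/da = b and d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad

        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)

        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()


# One training step on loss = (w * x - y)^2 with w = 2, x = 3, y = 12.
w, x = Value(2.0), Value(3.0)
err = w * x + Value(-12.0)   # prediction minus target
loss = err * err
loss.backward()              # fills w.grad with dloss/dw = 2(wx - y)x = -36
w.data -= 0.1 * w.grad       # gradient-descent weight update
```

Running `backward()` once populates every `.grad` in the graph; the final line is the "weights are updated" step, shrinking the loss on the next forward pass. micrograd itself adds more operations (e.g. power, ReLU) on top of this same pattern.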
Stars
47
Forks
3
Language
Jupyter Notebook
License
—
Category
ml-frameworks
Last pushed
May 13, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Anri-Lombard/micrograd"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
JonathanRaiman/theano_lstm
Nano size Theano LSTM module
google/tangent
Source-to-Source Debuggable Derivatives in Pure Python
ahrefs/ocannl
OCANNL: OCaml Compiles Algorithms for Neural Networks Learning
yoshoku/numo-openblas
Numo::OpenBLAS builds and uses OpenBLAS as a background library for Numo::Linalg
statusfailed/catgrad
a categorical deep learning compiler