trekhleb/micrograd-ts
🤖 A TypeScript version of karpathy/micrograd — a tiny scalar-valued autograd engine and a neural net on top of it
This project helps software developers understand the core mechanics of neural networks and how they learn. It implements basic mathematical operations as nodes in a computation graph, builds a small neural network on top of them, and visualizes how derivatives are calculated and how the network trains by adjusting its internal parameters through backpropagation.
No commits in the last 6 months.
Use this if you are a software developer who wants to deeply understand the foundational math behind neural networks and backpropagation, especially in a TypeScript environment.
Not ideal if you are looking for a high-level library to build complex neural networks for production applications without needing to understand the underlying math.
Stars
72
Forks
9
Language
TypeScript
License
MIT
Last pushed
Sep 10, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/trekhleb/micrograd-ts"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
JonathanRaiman/theano_lstm
🔬 Nano size Theano LSTM module
google/tangent
Source-to-Source Debuggable Derivatives in Pure Python
ahrefs/ocannl
OCANNL: OCaml Compiles Algorithms for Neural Networks Learning
yoshoku/numo-openblas
Numo::OpenBLAS builds and uses OpenBLAS as a background library for Numo::Linalg
statusfailed/catgrad
a categorical deep learning compiler