marfvr/micrograd-js
A port of Karpathy's micrograd to JavaScript
This is a JavaScript library that provides a scalar automatic differentiation engine, letting developers implement and train small neural networks or other gradient-based models directly in JavaScript. It takes numerical inputs, records the operations performed on them, and automatically computes gradients via backpropagation; these gradients are what optimizers use to update model parameters. It is aimed at software developers building web-based machine learning applications or interactive data science tools.
No commits in the last 6 months.
Use this if you are a JavaScript developer who needs to build simple neural networks or other gradient-based optimization algorithms directly in a web browser or Node.js environment.
Not ideal if you need a full-fledged deep learning framework with extensive pre-built layers, complex model architectures, or GPU acceleration, as this is a minimal library.
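To illustrate the core idea behind a micrograd-style engine, here is a minimal sketch of reverse-mode autodiff over scalars in TypeScript. The names (`Value`, `backward`) mirror Karpathy's original Python micrograd and are assumptions for illustration, not necessarily this library's exact API.

```typescript
// Minimal micrograd-style reverse-mode autodiff (illustrative sketch,
// not this repo's actual API). Each Value records its inputs and a
// closure that propagates gradients backward through that operation.
class Value {
  grad = 0;
  private backwardFn: () => void = () => {};
  constructor(public data: number, private prev: Value[] = []) {}

  add(other: Value): Value {
    const out = new Value(this.data + other.data, [this, other]);
    out.backwardFn = () => {
      // d(a+b)/da = 1, d(a+b)/db = 1
      this.grad += out.grad;
      other.grad += out.grad;
    };
    return out;
  }

  mul(other: Value): Value {
    const out = new Value(this.data * other.data, [this, other]);
    out.backwardFn = () => {
      // d(a*b)/da = b, d(a*b)/db = a
      this.grad += other.data * out.grad;
      other.grad += this.data * out.grad;
    };
    return out;
  }

  // Topologically sort the graph, then propagate gradients from this node.
  backward(): void {
    const topo: Value[] = [];
    const visited = new Set<Value>();
    const build = (v: Value) => {
      if (!visited.has(v)) {
        visited.add(v);
        v.prev.forEach(build);
        topo.push(v);
      }
    };
    build(this);
    this.grad = 1; // seed: dy/dy = 1
    for (const v of topo.reverse()) v.backwardFn();
  }
}

// y = a*b + a, so dy/da = b + 1 = 4 and dy/db = a = 2
const a = new Value(2);
const b = new Value(3);
const y = a.mul(b).add(a);
y.backward();
console.log(y.data, a.grad, b.grad); // 8 4 2
```

Note how each operation stores only a local gradient rule; the `backward` pass composes them via the chain rule, which is the whole trick that makes such engines small enough to port in a few hundred lines.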
Stars: 8
Forks: —
Language: TypeScript
License: MIT
Category:
Last pushed: Jun 02, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/marfvr/micrograd-js"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives
JonathanRaiman/theano_lstm: :microscope: Nano size Theano LSTM module
google/tangent: Source-to-Source Debuggable Derivatives in Pure Python
ahrefs/ocannl: OCANNL: OCaml Compiles Algorithms for Neural Networks Learning
yoshoku/numo-openblas: Numo::OpenBLAS builds and uses OpenBLAS as a background library for Numo::Linalg
statusfailed/catgrad: a categorical deep learning compiler