eliben/radgrad
Tracing-based reverse mode automatic differentiation (like autograd!)
radgrad is an educational implementation of reverse-mode automatic differentiation (AD). You write a Python function using NumPy-like syntax, and radgrad returns another function that computes its gradient. It's intended for students, educators, and anyone who wants to grasp the mechanics of AD systems like those underlying machine-learning frameworks.
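To illustrate the idea, here is a minimal sketch of tracing-based reverse-mode AD for scalar functions. This is not radgrad's actual API; the names `Var`, `backward`, and `grad` are hypothetical. Operator overloading records a trace of the computation, and a reverse pass propagates adjoints through that trace.

```python
class Var:
    """A traced value: records which operations produced it."""
    def __init__(self, value, parents=()):
        self.value = value        # forward value
        self.parents = parents    # pairs of (parent Var, local derivative)
        self.grad = 0.0           # accumulated adjoint

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    __radd__ = __add__
    __rmul__ = __mul__


def backward(out):
    """Topologically order the trace, then propagate adjoints in reverse."""
    order, seen = [], set()

    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for parent, _ in v.parents:
                visit(parent)
            order.append(v)

    visit(out)
    out.grad = 1.0
    for v in reversed(order):              # consumers before producers
        for parent, local in v.parents:
            parent.grad += v.grad * local  # chain rule


def grad(f):
    """Return a function that computes df/dx at a scalar x."""
    def df(x):
        vx = Var(x)
        backward(f(vx))
        return vx.grad
    return df


# Example: f(x) = x*x + 3*x, so f'(x) = 2x + 3
df = grad(lambda x: x * x + 3 * x)
print(df(2.0))  # → 7.0
```

This is the same tracing strategy autograd-style libraries use, just stripped down to scalars and two operators; a real library extends the overloads to arrays and many more primitives.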
No commits in the last 6 months.
Use this if you want to learn the internal workings of reverse-mode automatic differentiation from first principles, using practical Python examples.
Not ideal if you need a production-ready automatic differentiation library; consider established frameworks like PyTorch or TensorFlow for robust solutions.
Stars
27
Forks
1
Language
Python
License
Unlicense
Last pushed
Feb 02, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/eliben/radgrad"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
JonathanRaiman/theano_lstm
:microscope: Nano size Theano LSTM module
google/tangent
Source-to-Source Debuggable Derivatives in Pure Python
ahrefs/ocannl
OCANNL: OCaml Compiles Algorithms for Neural Networks Learning
yoshoku/numo-openblas
Numo::OpenBLAS builds and uses OpenBLAS as a background library for Numo::Linalg
statusfailed/catgrad
a categorical deep learning compiler