eliben/radgrad

Tracing-based reverse mode automatic differentiation (like autograd!)

27 / 100 (Experimental)

This tool helps researchers and engineers understand and implement automatic differentiation (AD) for mathematical functions. You provide a Python function written with NumPy-like syntax, and it returns another function that computes the original's derivative. It's intended for students, educators, and anyone who wants to grasp the mechanics of the AD systems inside machine learning frameworks.

No commits in the last 6 months.

Use this if you want to learn the internal workings of reverse-mode automatic differentiation from first principles, using practical Python examples.

Not ideal if you need a production-ready automatic differentiation library; consider established frameworks like PyTorch or TensorFlow for robust solutions.
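To make "tracing-based reverse mode" concrete, here is a minimal sketch of the technique in plain Python. This is not radgrad's actual API (which isn't shown on this page); the `Var` class and method names are illustrative. Each arithmetic operation records its inputs and local derivatives as it executes; the backward pass then walks that trace in reverse, applying the chain rule.

```python
# Minimal sketch of tracing-based reverse-mode AD (illustrative only;
# not radgrad's real API). Each op records (parent, local derivative)
# pairs; backward() propagates adjoints through the recorded trace.
class Var:
    def __init__(self, value, children=()):
        self.value = value
        self.grad = 0.0
        self._children = children  # pairs of (parent Var, local derivative)

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self):
        # Topologically order the trace, then push gradients backwards.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for child, _ in v._children:
                    visit(child)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for child, local in v._children:
                child.grad += v.grad * local  # chain rule

x = Var(3.0)
y = Var(4.0)
z = x * y + x   # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
# x.grad == 5.0, y.grad == 3.0
```

The key idea: the forward pass builds a computation graph as a side effect of running the function, so no source analysis is needed; the derivative of the whole function falls out of one reverse sweep over that graph.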

numerical-analysis scientific-computing computational-mathematics algorithm-education machine-learning-fundamentals
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 4 / 25


Stars: 27
Forks: 1
Language: Python
License: Unlicense
Last pushed: Feb 02, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/eliben/radgrad"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
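The same endpoint can be called from Python with only the standard library. The URL pattern is taken from the curl command above; the response fields in the parsing example are assumptions based on the stats shown on this page, since the actual JSON schema isn't documented here.

```python
import json
from urllib.parse import quote

# Base URL from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection: str, owner: str, repo: str) -> str:
    """Build the API URL for a repo's quality data."""
    return f"{BASE}/{quote(collection)}/{quote(owner)}/{quote(repo)}"

url = quality_url("ml-frameworks", "eliben", "radgrad")
# urllib.request.urlopen(url).read() would fetch the live data.

# Parsing a payload (field names are hypothetical, mirroring this page):
sample = '{"score": 27, "stars": 27, "forks": 1, "license": "Unlicense"}'
data = json.loads(sample)
```

With the free tier's 100 requests/day, it is worth caching responses locally rather than re-fetching on every run.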