noway/yagrad
yet another scalar autograd engine - featuring complex numbers and fixed DAG
This is a minimalist automatic differentiation engine for anyone learning how neural networks compute gradients. It evaluates simple mathematical expressions and shows how their gradients are calculated, including for complex numbers, which makes it well suited to students and educators exploring the foundational mechanics of machine learning.
No commits in the last 6 months.
Use this if you are studying the core principles of backpropagation and gradient computation in a simple, understandable codebase.
Not ideal if you need a robust, production-ready machine learning framework for building or training complex models.
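To make the idea concrete, here is a minimal sketch of what a scalar autograd engine of this kind looks like. This is illustrative only and is not yagrad's actual API: the `Value` class, its fields, and the demo expressions below are all assumptions for the example. Each operation records its parents and local derivatives, and `backward` walks the DAG in reverse topological order applying the chain rule, which works unchanged for complex numbers.

```python
# Sketch of a scalar autograd engine (NOT yagrad's real API).
class Value:
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data                  # may be a complex number
        self.grad = 0
        self.parents = parents            # nodes this value was computed from
        self.local_grads = local_grads    # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1, 1))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically order the DAG, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v.parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1
        for v in reversed(topo):
            for p, g in zip(v.parents, v.local_grads):
                p.grad += g * v.grad

# The same chain rule covers complex inputs: d/dz (z*z) = 2z.
z = Value(1 + 2j)
out = z * z
out.backward()
print(z.grad)  # → (2+4j)
```

Real engines of this kind (micrograd is the best-known example) add more operations and a nonlinearity or two, but the core mechanism is exactly this reverse sweep.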
Stars: 26
Forks: 2
Language: Python
License: MIT
Category:
Last pushed: Mar 20, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/noway/yagrad"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
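For programmatic access, the endpoint above can be called from Python as well. A small sketch, assuming only the URL pattern shown in the curl command; the response schema is not documented here, so the helper returns the parsed JSON as-is, and `quality_url`/`fetch_quality` are hypothetical names introduced for this example.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Mirrors the path used in the curl example above.
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    # No API key needed for the free tier (100 requests/day).
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Usage (performs a network request):
#   data = fetch_quality("ml-frameworks", "noway", "yagrad")
#   print(json.dumps(data, indent=2))
```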
Higher-rated alternatives
JonathanRaiman/theano_lstm: Nano size Theano LSTM module
google/tangent: Source-to-Source Debuggable Derivatives in Pure Python
ahrefs/ocannl: OCANNL: OCaml Compiles Algorithms for Neural Networks Learning
yoshoku/numo-openblas: Numo::OpenBLAS builds and uses OpenBLAS as a background library for Numo::Linalg
statusfailed/catgrad: a categorical deep learning compiler