noway/yagrad

yet another scalar autograd engine - featuring complex numbers and fixed DAG

30 / 100 (Emerging)

This is a minimalist automatic differentiation engine designed for people learning how neural networks compute gradients. It takes simple mathematical expressions and demonstrates how their gradients are calculated, including for complex numbers. It's ideal for students or educators exploring the foundational mechanics of machine learning.
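To make the idea concrete, here is a minimal sketch of a scalar autograd engine with complex-number support, in the spirit of the description above. The class and method names (`Value`, `backward`) and the overall design are illustrative assumptions modeled on common micrograd-style engines, not the repository's actual API.

```python
class Value:
    """Scalar node in a computation DAG; data may be complex."""
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data                 # float or complex value
        self.grad = 0j                   # accumulated gradient (complex)
        self._parents = parents          # upstream nodes
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1, 1))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically order the DAG, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1 + 0j
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += v.grad * g

# d(z*z + w)/dz at z = 2+1j is 2z = 4+2j; d/dw is 1.
z = Value(2 + 1j)
w = Value(3 + 0j)
out = z * z + w
out.backward()
print(z.grad)  # (4+2j)
print(w.grad)  # (1+0j)
```

For analytic operations like the ones above, the complex chain rule works exactly like the real one, so the gradient is the holomorphic derivative; non-analytic operations (such as conjugation) would need extra care.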

No commits in the last 6 months.

Use this if you are studying the core principles of backpropagation and gradient computation in a simple, understandable codebase.

Not ideal if you need a robust, production-ready machine learning framework for building or training complex models.

Machine Learning Education · Deep Learning Fundamentals · Automatic Differentiation · Computational Mathematics · Neural Network Theory
Stale 6m · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 7 / 25

How are scores calculated?

Stars: 26
Forks: 2
Language: Python
License: MIT
Last pushed: Mar 20, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/noway/yagrad"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
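The same endpoint can be called from Python with only the standard library. The URL pattern below is taken from the curl example above; the `quality_url` and `fetch_quality` helper names are illustrative, and the response's JSON fields are not documented here, so the sketch simply parses whatever comes back.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repository's quality data."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report for one repository."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Reproduces the URL from the curl example:
print(quality_url("ml-frameworks", "noway", "yagrad"))
```

Unauthenticated requests share the 100/day limit, so a script polling many repositories would want a free key.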