dfdx/Yota.jl

Reverse-mode automatic differentiation in Julia

Quality score: 38 / 100 — Emerging

Yota.jl helps Julia developers efficiently calculate gradients for complex mathematical functions, especially those found in deep learning models. You provide your Julia code for a function and its inputs, and it returns the gradients needed for optimization. This is for machine learning engineers, data scientists, and researchers working with neural networks and other differentiable programs in Julia.
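A minimal sketch of the workflow described above, assuming Yota's exported `grad` function, which returns the function's value together with a tuple of gradients (the `loss` function and input values here are illustrative, not from the project's own documentation):

```julia
using Yota

# A simple differentiable function: sum of squares.
loss(x) = sum(x .^ 2)

x = [1.0, 2.0, 3.0]

# grad returns the value of loss(x) and a tuple of gradients:
# g[1] is the gradient w.r.t. the function itself (typically zero),
# g[2] is the gradient w.r.t. x — for sum(x.^2) that is 2x.
val, g = grad(loss, x)
```

The returned gradient for `x` can then be fed directly into an optimizer update step, which is the typical use in deep-learning training loops.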

159 stars. No commits in the last 6 months.

Use this if you are a Julia developer building or optimizing deep learning models and need an efficient way to compute gradients for large inputs.

Not ideal if you are not a Julia developer or if your primary need is for forward-mode automatic differentiation.

deep-learning machine-learning-optimization neural-networks scientific-computing gradient-calculation
Badges: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 12 / 25


Stars: 159
Forks: 12
Language: Julia
License: MIT
Last pushed: Aug 05, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/dfdx/Yota.jl"

Open to everyone — 100 requests/day with no key required; a free key raises the limit to 1,000/day.