dfdx/Yota.jl
Reverse-mode automatic differentiation in Julia
Yota.jl provides reverse-mode automatic differentiation for Julia, efficiently computing gradients of complex mathematical functions such as those found in deep learning models. You supply a Julia function and its inputs, and it returns the gradients needed for optimization. It is aimed at machine learning engineers, data scientists, and researchers working with neural networks and other differentiable programs in Julia.
159 stars. No commits in the last 6 months.
Use this if you are a Julia developer building or optimizing deep learning models and need an efficient way to compute gradients for large inputs.
Not ideal if you are not working in Julia or if you primarily need forward-mode automatic differentiation.
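A minimal sketch of the usage described above, assuming Yota.jl is installed and based on its exported `grad` function, which returns the function's value together with a tuple of gradients (by convention, the first entry is the derivative with respect to the function itself):

```julia
using Yota

# A toy scalar "loss" over two matrices.
loss(W, x) = sum(W * x)

W = rand(3, 4)
x = rand(4, 3)

# grad evaluates loss(W, x) and computes gradients via reverse-mode AD:
# g[1] is the gradient w.r.t. the function itself,
# g[2] is the gradient w.r.t. W, and g[3] is the gradient w.r.t. x.
val, g = grad(loss, W, x)
```

The returned gradients (`g[2]`, `g[3]`) have the same shapes as `W` and `x`, ready to be fed into an optimizer update step.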
Stars: 159
Forks: 12
Language: Julia
License: MIT
Category:
Last pushed: Aug 05, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/dfdx/Yota.jl"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
CliMA/Oceananigans.jl
🌊 Julia software for fast, friendly, flexible, ocean-flavored fluid dynamics on CPUs and GPUs
JuliaLang/julia
The Julia Programming Language
WassimTenachi/PhySO
Physical Symbolic Optimization
EnzymeAD/Enzyme.jl
Julia bindings for the Enzyme automatic differentiator
astroautomata/SymbolicRegression.jl
Distributed High-Performance Symbolic Regression in Julia