denizyuret/AutoGrad.jl
Julia port of the Python autograd package.
AutoGrad.jl lets Julia programmers compute derivatives of their code automatically, a core requirement for numerical optimization and for training machine learning models. Given a Julia function, it returns the function's value together with its gradients with respect to specified parameters. It is aimed at data scientists, machine learning engineers, and researchers doing numerical optimization in Julia.
168 stars. No commits in the last 6 months.
Use this if you need to compute gradients of complex Julia functions efficiently, including functions with loops and conditionals, for tasks such as training machine learning models or running scientific simulations.
Not ideal if you are working exclusively in Python or another language, as this is a Julia-specific library.
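As a rough sketch of how the package is typically used (based on AutoGrad.jl's documented exports `grad`, `Param`, and `@diff`; exact signatures may differ across versions):

```julia
using AutoGrad  # install with: ] add AutoGrad

# grad(f) returns a function that computes df/dx.
f(x) = sin(x) + cos(x)
g = grad(f)
g(0.0)  # derivative cos(x) - sin(x) at 0, i.e. 1.0

# Tape-based interface: mark parameters with Param,
# record a computation with @diff, then query gradients.
w = Param([1.0, 2.0, 3.0])
loss = @diff sum(abs2, w)
grad(loss, w)  # gradient of sum of squares, i.e. 2w
```

Because differentiation works by tracing the actual execution, ordinary Julia control flow (loops, conditionals, recursion) inside `f` is handled without special annotations.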
Stars
168
Forks
24
Language
Julia
License
—
Category
—
Last pushed
Nov 15, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/denizyuret/AutoGrad.jl"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
CliMA/Oceananigans.jl
🌊 Julia software for fast, friendly, flexible, ocean-flavored fluid dynamics on CPUs and GPUs
JuliaLang/julia
The Julia Programming Language
WassimTenachi/PhySO
Physical Symbolic Optimization
EnzymeAD/Enzyme.jl
Julia bindings for the Enzyme automatic differentiator
astroautomata/SymbolicRegression.jl
Distributed High-Performance Symbolic Regression in Julia