denizyuret/AutoGrad.jl

Julia port of the Python autograd package.

Score: 43 / 100 (Emerging)

This tool lets Julia programmers automatically compute derivatives of their code, which is essential for optimizing models and training machine learning algorithms. You write ordinary Julia functions, and it returns each function's value along with its gradients with respect to specified parameters. It is aimed at data scientists, machine learning engineers, and researchers doing numerical optimization in Julia.
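As a quick illustration, here is a minimal sketch using AutoGrad.jl's documented `Param` / `@diff` / `grad` API (the specific values shown are our own example, not from this listing):

```julia
using AutoGrad

x = Param([1.0, 2.0, 3.0])   # mark x as a differentiable parameter
y = @diff sum(abs2, x)       # record the computation of sum of squares

value(y)                     # 14.0  -- the function's value
grad(y, x)                   # [2.0, 4.0, 6.0]  -- the gradient d(sum x^2)/dx = 2x
```

`@diff` records the computation on a tape, and `grad(y, x)` then returns the gradient of the recorded result with respect to any `Param` used in it.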

168 stars. No commits in the last 6 months.

Use this if you need to efficiently compute gradients of complex Julia functions, including those with loops and conditionals, for tasks like machine learning model training or scientific simulations.
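Since the tape is built at runtime, ordinary Julia control flow differentiates without special handling. A small sketch (the function `f` here is our own illustrative example) shows a gradient taken through a loop and a conditional:

```julia
using AutoGrad

# Sum the squares of the odd-indexed entries, using a plain loop and branch.
function f(w)
    s = 0.0
    for i in 1:length(w)
        if isodd(i)
            s += w[i]^2
        end
    end
    return s
end

w = Param([1.0, 2.0, 3.0])
y = @diff f(w)
grad(y, w)   # [2.0, 0.0, 6.0] -- zero for the entry the branch skipped
```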

Not ideal if you are working exclusively in Python or another language, as this is a Julia-specific library.

numerical-optimization machine-learning-engineering scientific-computing data-science gradient-descent
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 17 / 25


Stars: 168
Forks: 24
Language: Julia
License:
Last pushed: Nov 15, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/denizyuret/AutoGrad.jl"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.