google/jaxopt
Hardware accelerated, batchable and differentiable optimizers in JAX.
This project provides pre-built optimization algorithms that run efficiently on accelerators such as GPUs and TPUs. Given a mathematical optimization problem, it finds a solution efficiently, and it can solve many similar problems at once in a single batched call. It is aimed at machine learning engineers, researchers, and data scientists who are building or experimenting with custom machine learning models or complex data-analysis systems.
1,030 stars. Used by 3 other packages. Available on PyPI.
Use this if you are a machine learning practitioner or researcher building custom models and need to integrate efficient, differentiable optimization routines into your JAX-based workflows, especially for large datasets or computationally intensive tasks on specialized hardware.
Not ideal if you need a currently maintained, actively developed optimization library: this project is no longer supported, so consider alternatives such as Optax.
Stars
1,030
Forks
72
Language
Python
License
Apache-2.0
Category
Last pushed
Dec 17, 2025
Commits (30d)
0
Dependencies
4
Reverse dependents
3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/google/jaxopt"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Compare
Related frameworks
explosion/thinc
🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
google-deepmind/optax
Optax is a gradient processing and optimization library for JAX.
patrick-kidger/diffrax
Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable....
google/grain
Library for reading and processing ML training data.
patrick-kidger/equinox
Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/