optax and jaxopt
These two libraries are **complements**. Optax focuses on gradient transformations and gradient-based optimization utilities, while JAXopt provides a broader range of hardware-accelerated, differentiable solvers, including non-gradient-based methods. JAXopt's gradient-based solvers can also wrap Optax optimizers, so the two are often used together.
About optax
google-deepmind/optax
Optax is a gradient processing and optimization library for JAX.
This library helps machine learning researchers efficiently train their neural networks. It takes a JAX-based model's parameters and gradients, applies various optimization techniques, and outputs updated parameters to improve model performance. It's used by machine learning practitioners and researchers who build and experiment with custom deep learning models.
About jaxopt
google/jaxopt
Hardware accelerated, batchable and differentiable optimizers in JAX.
This project provides pre-built optimization algorithms that run efficiently on accelerators such as GPUs and TPUs. Given a mathematical optimization problem, its solvers iterate to a solution, and because they are written in JAX they can be batched with `jax.vmap` to solve many similar problems at once and differentiated through with `jax.grad`. This is for machine learning engineers, researchers, and data scientists who are building or experimenting with custom machine learning models or complex data analysis systems.