davisyoshida/haiku-mup
A port of muP to JAX/Haiku
This project helps machine learning researchers and practitioners train very large neural networks more effectively. It ports muP (Maximal Update Parametrization) to JAX/Haiku: under muP, hyperparameters such as the learning rate that you tune on a small model remain near-optimal when the model's width is scaled up. You mark up your Haiku model and wrap your optimizer, and the library applies the muP initialization and per-layer learning-rate scaling (see the usage sketch below).
No commits in the last 6 months.
Use this if you are developing large-scale neural network models and want to ensure consistent training behavior and optimal learning rates as you increase model size.
Not ideal if you are working with small models that don't require significant scaling or if you are not comfortable modifying your model's architecture and optimizer setup.
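For a concrete sense of the workflow, here is a minimal sketch of a muP setup in Haiku. It is a sketch only: the names Mup, apply_mup, Readout, wrap_model, and wrap_optimizer follow the upstream README's pattern as best recalled here, and should be verified against the repository before use.

from functools import partial

import jax
import jax.numpy as jnp
import haiku as hk
import optax
# Assumption: these names come from memory of the haiku-mup README;
# check the repo for the current API.
from haiku_mup import apply_mup, Mup, Readout


def forward(x, width=100):
    with apply_mup():  # parameters created here get muP initialization
        x = hk.Linear(width)(x)
        x = jax.nn.relu(x)
        return Readout(10)(x)  # muP-aware stand-in for the output hk.Linear


mup = Mup()
dummy_input = jnp.zeros((1, 28 * 28))

# 1. Initialize a narrow "base" model so muP can record base parameter shapes.
base_model = hk.transform(partial(forward, width=1))
with mup.init_base():
    base_model.init(jax.random.PRNGKey(0), dummy_input)

# 2. Initialize the full-width target model under muP scaling.
model = hk.transform(forward)
with mup.init_target():
    params = model.init(jax.random.PRNGKey(0), dummy_input)
model = mup.wrap_model(model)

# 3. Wrap the optimizer so each layer gets its width-dependent learning rate.
optimizer = mup.wrap_optimizer(optax.adam(3e-4), adam=True)
opt_state = optimizer.init(params)

The key design point is that the base model pins down reference shapes at small width, so a learning rate tuned there is intended to transfer as width grows.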
Stars
25
Forks
3
Language
Python
License
MIT
Category
ML Frameworks
Last pushed
Oct 23, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/davisyoshida/haiku-mup"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
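The same request in Python, for scripted use. This assumes the endpoint returns JSON; the response schema is not shown in this listing, so inspect the output before depending on specific fields.

import requests

# Same endpoint as the curl example above; no API key is required
# for up to 100 requests/day per the note above.
url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/davisyoshida/haiku-mup"

resp = requests.get(url, timeout=10)
resp.raise_for_status()

# Assumption: the endpoint returns JSON; print it to see the available fields.
data = resp.json()
print(data)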
Higher-rated alternatives
explosion/thinc
🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
google-deepmind/optax
Optax is a gradient processing and optimization library for JAX.
patrick-kidger/diffrax
Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. …
google/grain
Library for reading and processing ML training data.
patrick-kidger/equinox
Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/