davisyoshida/haiku-mup

A port of muP to JAX/Haiku

Score: 33 / 100 (Emerging)

This project helps machine learning researchers and practitioners train very large neural networks more effectively. It ports muP (the Maximal Update Parametrization) to JAX/Haiku: by reparameterizing a network's initialization and per-layer learning rates, it makes optimal hyperparameters approximately independent of model width, so a learning rate tuned on a small proxy model remains near-optimal when the model is scaled up. You adapt your Haiku model and optimizer to the parameterization, and training then behaves more predictably as width grows.
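For a sense of the intended workflow, here is a minimal sketch based on the project's README. The Mup, apply_mup, and MuReadout names, the haiku_mup import path, and the wrap_model/wrap_optimizer calls are assumptions taken from that README; verify them against the repository before relying on this.

from functools import partial

import jax
import jax.numpy as jnp
import haiku as hk
import optax
from haiku_mup import Mup, apply_mup, MuReadout  # assumed import path

def fn(x, width):
    # apply_mup() marks parameters created in this scope for muP scaling
    with apply_mup():
        h = hk.Linear(width)(x)
        h = jax.nn.relu(h)
        return MuReadout(10)(h)  # muP-aware output layer

mup = Mup()
x = jnp.zeros((1, 16))
rng = jax.random.PRNGKey(0)

# Initialize a narrow "base" model so Mup can record base widths...
with mup.init_base():
    hk.transform(partial(fn, width=64)).init(rng, x)

# ...then the wide "target" model, whose initialization gets rescaled.
model = hk.transform(partial(fn, width=1024))
with mup.init_target():
    params = model.init(rng, x)
model = mup.wrap_model(model)

# wrap_optimizer applies per-parameter muP learning-rate scaling, so a
# learning rate tuned at width 64 should transfer to width 1024.
optimizer = mup.wrap_optimizer(optax.adam(3e-4), adam=True)

The point of the two-phase init is that muP scales initialization and learning rates by the ratio of target width to base width: you sweep hyperparameters once at the small width and reuse the winners at the large one.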

No commits in the last 6 months.

Use this if you are developing large-scale neural network models and want training behavior and learning rates to stay consistent as you increase model width.

Not ideal if you are working with small models that don't require significant scaling or if you are not comfortable modifying your model's architecture and optimizer setup.

deep-learning neural-networks model-scaling hyperparameter-tuning machine-learning-research
Stale (6m) · No package · No dependents

Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 10 / 25

Stars: 25
Forks: 3
Language: Python
License: MIT
Last pushed: Oct 23, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/davisyoshida/haiku-mup"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.