young-geng/scalax

A simple library for scaling up JAX programs

Quality score: 52 / 100 (Established)

Scalax helps machine learning engineers and researchers scale up the training of their JAX-based models. It takes existing single-device JAX model and training code and automatically distributes it across multiple GPUs or TPUs, enabling faster experimentation and the training of larger models.

146 stars. Available on PyPI.

Use this if you are a machine learning engineer or researcher working with JAX and need to easily distribute your model training across many devices (GPUs/TPUs) without extensive code changes.

Not ideal if you are not using JAX for your machine learning models or if you only train on a single GPU/TPU.
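Scalax's own helper API is not shown on this page, so here is a minimal sketch of the underlying mechanism such libraries automate: JAX's sharding primitives for running one program across many devices. This is plain JAX, not scalax code, and it degrades gracefully to a single CPU device.

```python
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# One mesh axis spanning every available device (just one device on CPU).
mesh = Mesh(np.array(jax.devices()), axis_names=("data",))

# Shard the batch along its leading axis across the "data" mesh axis.
batch = jnp.arange(8.0).reshape(8, 1)
batch = jax.device_put(batch, NamedSharding(mesh, P("data", None)))

@jax.jit
def step(x):
    # jit compiles one program; XLA runs it on each shard in parallel.
    return (2.0 * x).sum()

print(step(batch))  # 56.0
```

Libraries like scalax wrap this mesh-and-sharding machinery so that the same training step can be annotated once and run data- or model-parallel without restructuring the code by hand.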

distributed-training machine-learning-engineering deep-learning large-model-training AI-research
Maintenance 6 / 25
Adoption 10 / 25
Maturity 25 / 25
Community 11 / 25


Stars: 146
Forks: 11
Language: Python
License: Apache-2.0
Category: llm-fine-tuning
Last pushed: Nov 04, 2025
Commits (30d): 0
Dependencies: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/young-geng/scalax"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.