young-geng/scalax
A simple library for scaling up JAX programs
Scalax helps machine learning engineers and researchers accelerate the training of their JAX-based models. It takes your existing single-device JAX model and training code and automatically scales it across multiple GPUs or TPUs, enabling faster experimentation and the training of larger models.
146 stars. Available on PyPI.
Use this if you are a machine learning engineer or researcher working with JAX and need to easily distribute your model training across many devices (GPUs/TPUs) without extensive code changes.
Not ideal if you are not using JAX for your machine learning models or if you only train on a single GPU/TPU.
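For context, the kind of single-device JAX training step that scalax is designed to scale out looks roughly like the sketch below. This is plain JAX only; no scalax helper names are assumed here, since the point is the unmodified starting code such a library takes as input.

```python
import jax
import jax.numpy as jnp

def loss_fn(w, x, y):
    # Simple mean-squared-error loss for a linear model.
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

@jax.jit
def train_step(w, x, y, lr=0.1):
    # One SGD step; this is the function a scaling library
    # would distribute across devices.
    g = jax.grad(loss_fn)(w, x, y)
    return w - lr * g

w = jnp.zeros((3,))
x = jnp.ones((4, 3))
y = jnp.ones((4,))
w = train_step(w, x, y)
```

A scaling library's job is to run a step like this across a device mesh with sharded parameters and data, without the author rewriting the step itself.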
Stars: 146
Forks: 11
Language: Python
License: Apache-2.0
Category:
Last pushed: Nov 04, 2025
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/young-geng/scalax"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
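The same endpoint can be queried from Python. The sketch below builds the request URL shown in the curl example; the response fields used at the end are an assumption for illustration only, since the API's schema is not documented here.

```python
import json
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository endpoint URL, percent-encoding each
    # path segment. Fetch it with urllib.request.urlopen or requests.
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

url = quality_url("young-geng", "scalax")

# Hypothetical sample payload; the real field names may differ.
sample = '{"stars": 146, "forks": 11, "license": "Apache-2.0"}'
data = json.loads(sample)
print(data["stars"])  # → 146
```

Keep in mind the keyless tier is limited to 100 requests per day.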
Related projects
OptimalScale/LMFlow
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
adithya-s-k/AI-Engineering.academy
Mastering Applied AI, One Concept at a Time
jax-ml/jax-llm-examples
Minimal yet performant LLM examples in pure JAX
riyanshibohra/TuneKit
Upload your data → Get a fine-tuned SLM. Free.
JIA-Lab-research/LongLoRA
Code and documents of LongLoRA and LongAlpaca (ICLR 2024 Oral)