google/paxml

Pax is a JAX-based machine learning framework for training large-scale models. Pax allows for advanced and fully configurable experimentation and parallelization, and has demonstrated industry-leading model FLOPs utilization rates.

Quality score: 71 / 100 (Verified)

Paxml helps machine learning engineers efficiently train very large-scale deep learning models, particularly large language models like GPT-3, on Google Cloud TPUs. It takes your model configurations and training datasets, and outputs trained models along with performance metrics like loss curves and perplexity. This framework is for machine learning practitioners and researchers working with massive datasets and models requiring significant computational resources.
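A minimal sketch of what those model configurations and datasets look like in code, assuming the BaseExperiment / experiment_registry interface from the paxml package (the class and method names below follow the repository's conventions but should be checked against the current source):

from paxml import base_experiment
from paxml import experiment_registry


@experiment_registry.register
class MyLmExperiment(base_experiment.BaseExperiment):
    """Bundles a model/task configuration with its input datasets."""

    def task(self):
        # Return the task config: model architecture, optimizer,
        # checkpointing, and SPMD partitioning settings.
        raise NotImplementedError

    def datasets(self):
        # Return the list of training/eval input pipeline configs.
        raise NotImplementedError

A registered experiment is then launched by name (for example via the --exp flag of paxml's main entry point), which runs training and writes checkpoints and metrics to the chosen log directory.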

550 stars. Actively maintained with 7 commits in the last 30 days. Available on PyPI.

Use this if you are a machine learning engineer or researcher training large-scale deep learning models, especially large language models, and need to optimize their performance and computational efficiency on Google Cloud TPUs.

Not ideal if you are working with smaller models, do not have access to Google Cloud TPUs, or prefer a framework that offers broader GPU support out-of-the-box (though an NVIDIA-optimized version exists for GPUs).

large-language-models deep-learning-training cloud-tpu model-optimization machine-learning-research
Maintenance: 17 / 25
Adoption: 10 / 25
Maturity: 25 / 25
Community: 19 / 25

Stars: 550
Forks: 70
Language: Python
License: Apache-2.0
Category: llm-fine-tuning
Last pushed: Mar 12, 2026
Commits (30d): 7
Dependencies: 20

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/google/paxml"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
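The same data can be fetched from a script; the sketch below assumes only the endpoint URL shown above and that it returns JSON (how a free API key is passed is not documented on this page, so this uses the anonymous 100-requests/day access):

import requests

resp = requests.get(
    "https://pt-edge.onrender.com/api/v1/quality/llm-tools/google/paxml",
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # quality scores and repo metadata as a JSON document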