pytorch/xla
Enabling PyTorch on XLA Devices (e.g. Google TPU)
PyTorch/XLA helps machine learning engineers and researchers accelerate their deep learning models. It takes models built with PyTorch and runs them efficiently on Google's Cloud TPUs, enabling faster training and experimentation, particularly for large-scale models.
Use this if you are a machine learning practitioner looking to significantly speed up your PyTorch model training by leveraging Google's powerful Cloud TPUs.
Not ideal if your deep learning workflow doesn't involve PyTorch, or if you lack access to Google Cloud TPUs or prefer not to use them.
Stars: 2,756
Forks: 566
Language: C++
License: —
Category: —
Last pushed: Dec 18, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/pytorch/xla"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
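The same endpoint can be queried from Python instead of curl. A minimal sketch using only the standard library, assuming the endpoint returns a JSON body; the helper names and the response fields are assumptions, not part of the documented API:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, org: str, repo: str) -> str:
    """Build the quality endpoint URL for a repository (hypothetical helper)."""
    return f"{BASE}/{category}/{org}/{repo}"


def fetch_quality(category: str, org: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON response (shape is an assumption)."""
    with urllib.request.urlopen(quality_url(category, org, repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Example: fetch_quality("ml-frameworks", "pytorch", "xla")
# returns the same data shown on this page, subject to the 100 requests/day limit.
```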
Related frameworks
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
opthub-org/pytorch-bsf
PyTorch implementation of Bezier simplex fitting
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch
stanford-centaur/PyPantograph
A Machine-to-Machine Interaction System for Lean 4.