Sea-Snell/JAXSeq

Train very large language models in JAX.

Score: 39 / 100 (Emerging)

This project helps machine learning engineers and researchers efficiently train very large language models like GPT-2, GPT-J, T5, and OPT using the JAX framework. You provide your training and evaluation data in a specific JSONL format, and the project outputs a trained language model. It's designed for users who need to handle massive models and datasets across multiple GPUs or TPUs.
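The exact JSONL schema is documented in the repository's README; as a rough illustration only, here is a minimal sketch of what sequence-to-sequence training data looks like in JSONL form. The field names (in_text, out_text) are hypothetical and may not match JAXSeq's actual schema.

import json

# Hypothetical examples; the field names ("in_text", "out_text") are
# illustrative only -- consult the JAXSeq README for the exact schema.
examples = [
    {"in_text": "Translate to French: Hello, world.",
     "out_text": "Bonjour, le monde."},
    {"in_text": "Summarize: JAX is a library for composable function transformations.",
     "out_text": "JAX enables composable function transformations."},
]

# JSONL means one JSON object per line.
with open("train.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")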

210 stars. No commits in the last 6 months.

Use this if you are an ML engineer or researcher who needs to train large language models on distributed hardware (like TPU pods or GPU clusters) and requires flexible control over model and data parallelism.

Not ideal if you are looking for a high-level, opinionated framework that abstracts away many details of distributed training, or if you only need to fine-tune smaller models on a single device.

Topics: large-language-models, distributed-training, natural-language-processing, deep-learning, model-training
Status: Stale (6 months), no package published, no dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 210
Forks: 17
Language: Python
License: MIT
Last pushed: Oct 21, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Sea-Snell/JAXSeq"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
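The same endpoint can be queried from Python as well as curl. A minimal sketch, assuming the endpoint returns a JSON body with the score data shown on this page (the exact response fields are not documented here), and using the free keyless tier:

import requests

# Same endpoint as the curl example above; no API key is needed for the
# free tier (100 requests/day, per the note above).
url = "https://pt-edge.onrender.com/api/v1/quality/transformers/Sea-Snell/JAXSeq"

resp = requests.get(url, timeout=10)
resp.raise_for_status()

# Assumption: the response is JSON carrying the quality/score data;
# field names are not specified on this page, so just print the payload.
data = resp.json()
print(data)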