jax-ml/jax-llm-examples
Minimal yet performant LLM examples in pure JAX
This project provides pre-built, high-performance example implementations of large language models (LLMs) such as Llama, DeepSeek, and Qwen, written in pure JAX. It lets AI researchers and engineers quickly set up and experiment with different LLM architectures, taking model configurations and training data as input and producing trained, ready-to-use models.
Use this if you are an AI researcher or engineer looking for performant, ready-to-run examples of large language models for experimentation or fine-tuning.
Not ideal if you are an end-user looking for a pre-trained, production-ready LLM application rather than the underlying model implementations.
Stars: 244
Forks: 32
Language: Python
License: Apache-2.0
Category: transformers
Last pushed: Jan 14, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/jax-ml/jax-llm-examples"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
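For programmatic access, the same endpoint can be called from Python. This is a minimal sketch using only the standard library; it assumes the endpoint returns JSON (the response schema is not documented here), and the `build_url` and `fetch_quality` helper names are illustrative, not part of the API:

```python
import json
import urllib.request

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def build_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch repository quality data; assumes the endpoint returns JSON."""
    with urllib.request.urlopen(build_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Unauthenticated calls are limited to 100 requests/day.
    data = fetch_quality("transformers", "jax-ml", "jax-llm-examples")
    print(data)
```

No API key header is added here since unauthenticated access is allowed; with a free key, you would attach it per the service's own documentation.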
Related models
OptimalScale/LMFlow
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
adithya-s-k/AI-Engineering.academy
Mastering Applied AI, One Concept at a Time
young-geng/scalax
A simple library for scaling up JAX programs
riyanshibohra/TuneKit
Upload your data → Get a fine-tuned SLM. Free.
JIA-Lab-research/LongLoRA
Code and documents of LongLoRA and LongAlpaca (ICLR 2024 Oral)