transformerlab/transformerlab-app

The open source research environment for AI researchers to seamlessly train, evaluate, and scale models from local hardware to GPU clusters.

Score: 68 / 100 (Established)

This platform helps AI researchers efficiently train, fine-tune, and evaluate large language models and diffusion models. You input datasets and model architectures, and it outputs trained models and performance analytics. It's designed for machine learning researchers working individually or in teams, managing complex experiments on various hardware.

4,820 stars. Actively maintained with 1,270 commits in the last 30 days.

Use this if you are an AI researcher who needs a unified environment to manage the full lifecycle of training, evaluating, and deploying foundation models, from a single machine to a GPU cluster.

Not ideal if you are looking for a simple tool for basic model inference without needing to train, fine-tune, or rigorously evaluate models.

Tags: AI research, ML model training, LLM fine-tuning, diffusion models, ML experiment management
No package published · No dependents
Maintenance: 22 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 20 / 25


Stars: 4,820
Forks: 501
Language: Python
License: AGPL-3.0
Last pushed: Mar 12, 2026
Commits (30d): 1,270

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/transformerlab/transformerlab-app"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
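The four sub-scores (each out of 25) sum to the overall score out of 100. A minimal Python sketch of consuming the endpoint's response, assuming a hypothetical JSON shape; the field names here are illustrative assumptions, not the documented schema:

```python
import json

# Hypothetical response body for the quality endpoint above.
# Field names ("score", "tier", "breakdown") are assumptions for illustration;
# the actual schema is not documented on this page.
sample = json.loads("""{
  "score": 68,
  "tier": "Established",
  "breakdown": {"maintenance": 22, "adoption": 10, "maturity": 16, "community": 20}
}""")

# The sub-scores add up to the overall 0-100 score.
total = sum(sample["breakdown"].values())
print(total, sample["tier"])  # 68 Established
```

In a live client you would replace `sample` with the parsed body of the `curl` request shown above.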