transformerlab/transformerlab-app
The open source research environment for AI researchers to seamlessly train, evaluate, and scale models from local hardware to GPU clusters.
This platform helps AI researchers efficiently train, fine-tune, and evaluate large language models and diffusion models: given datasets and model architectures, it produces trained models and performance analytics. It's designed for machine learning researchers working individually or in teams, managing complex experiments across varied hardware.
4,820 stars. Actively maintained with 1,270 commits in the last 30 days.
Use this if you are an AI researcher who needs a unified environment to manage the full lifecycle of training, evaluating, and deploying foundation models, from a single machine to a GPU cluster.
Not ideal if you are looking for a simple tool for basic model inference without needing to train, fine-tune, or rigorously evaluate models.
Stars
4,820
Forks
501
Language
Python
License
AGPL-3.0
Category
Last pushed
Mar 12, 2026
Commits (30d)
1,270
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/transformerlab/transformerlab-app"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Recent Releases
Related models
naru-project/naru
Neural Relation Understanding: neural cardinality estimators for tabular data
neurocard/neurocard
State-of-the-art neural cardinality estimators for join queries
danielzuegner/code-transformer
Implementation of the paper "Language-agnostic representation learning of source code from...
salesforce/CodeTF
CodeTF: One-stop Transformer Library for State-of-the-art Code LLM
salcc/QuantumTransformers
Quantum Transformers for High Energy Physics Analysis at the Large Hadron Collider