sytelus/nanuGPT

Simple, reliable, and well-tested training code for quick experiments with transformer-based models

Overall score: 37/100 (Emerging)

This tool helps machine learning engineers and researchers quickly experiment with and train transformer-based language models. You provide raw text data or use pre-configured datasets like TinyShakespeare or OpenWebText, and it outputs a trained language model ready for text generation. It's designed for practitioners developing and optimizing large language models, especially when GPU resources are limited.

Use this if you need a flexible and reproducible framework to rapidly prototype and train custom transformer models, even with a single GPU or CPU.

Not ideal if you want a high-level API for deploying existing models, or for fine-tuning without understanding the training process.

Tags: language-model-training, deep-learning-research, text-generation, transformer-models, ml-experimentation
No package published; no dependents.
Score breakdown (each out of 25; the four subscores sum to the 37/100 overall):
Maintenance: 10/25
Adoption: 5/25
Maturity: 16/25
Community: 6/25


Repository stats:
Stars: 13
Forks: 1
Language: Python
License: MIT
Last pushed: Mar 13, 2026
Commits (last 30 days): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/sytelus/nanuGPT"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
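
The same data can also be fetched programmatically. Below is a minimal sketch in Python using only the standard library; it assumes the endpoint returns JSON, which this page does not explicitly confirm.

import json
import urllib.request

# Quality endpoint for this repository (taken from the curl example above).
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/sytelus/nanuGPT"

# Fetch the response and pretty-print it; assumes a JSON body.
with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))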