Andras7/gpt2-pytorch
Extremely simple and understandable GPT2 implementation with minor tweaks
This is a tool for machine learning practitioners and researchers who want to build custom text generation models. You feed in your own text data and train a model that can then generate new text sequences, including in languages other than English. It is suited to those who want to experiment with GPT-style language model architectures.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher looking for a straightforward and adaptable framework to train GPT-2 style language models, especially with custom tokenizers or specific optimization needs.
Not ideal if you are looking for an out-of-the-box text generation solution without needing to train or customize a model.
Stars: 21
Forks: 3
Language: Python
License: —
Category:
Last pushed: Dec 06, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Andras7/gpt2-pytorch"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
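If you prefer to call the endpoint from Python rather than curl, a minimal sketch follows. The URL pattern is taken from the example above; the `quality_url` helper and the assumption that the endpoint returns JSON are illustrative, not documented guarantees — inspect the actual response before relying on any field names.

```python
import json
import urllib.request

# Base endpoint, taken from the curl example above.
BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, repo: str) -> str:
    """Build the API URL for a category and an owner/name repo slug."""
    return f"{BASE_URL}/{category}/{repo}"


def fetch_quality(category: str, repo: str) -> dict:
    """Fetch the quality record for a repo (requires network access).

    Assumes the endpoint returns a JSON body; check the real response
    shape before depending on specific keys.
    """
    with urllib.request.urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)


# Example: the repo on this page.
url = quality_url("transformers", "Andras7/gpt2-pytorch")
```

If you have a free API key, pass it however the service documents (for example, as a header or query parameter); the snippet above sticks to the keyless 100 requests/day tier shown in the curl example.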
Higher-rated alternatives
tabularis-ai/be_great
A novel approach for synthesizing tabular data using pretrained large language models
EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron...
shibing624/textgen
TextGen: Implementation of Text Generation models, include LLaMA, BLOOM, GPT2, BART, T5, SongNet...
ai-forever/ru-gpts
Russian GPT3 models.
AdityaNG/kan-gpt
The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold...