Andras7/gpt2-pytorch

Extremely simple and understandable GPT2 implementation with minor tweaks

Score: 25 / 100 (Experimental)

This is a tool for machine learning practitioners and researchers who want to build custom text generation models. You feed in your own text data and train a model that can then generate new text sequences, including in languages other than English. It suits those who want to experiment with language model architectures rather than use a finished product.

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher looking for a straightforward and adaptable framework to train GPT-2 style language models, especially with custom tokenizers or specific optimization needs.

Not ideal if you are looking for an out-of-the-box text generation solution without needing to train or customize a model.

Tags: natural-language-processing, text-generation, deep-learning-research, custom-language-models, AI-experimentation
Badges: No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 8 / 25
Community 11 / 25
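
The overall score appears to be the sum of the four sub-scores above; a quick arithmetic check (assuming a simple additive model, which is consistent with the numbers shown):

```python
# Sub-scores as listed on this page; summing them is an assumption
# based on the fact that they add up to the displayed overall score.
subscores = {"Maintenance": 0, "Adoption": 6, "Maturity": 8, "Community": 11}
overall = sum(subscores.values())
print(overall)  # 25, matching the 25 / 100 shown above
```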


Stars: 21
Forks: 3
Language: Python
License: None
Last pushed: Dec 06, 2019
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Andras7/gpt2-pytorch"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
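
The same request can be made from Python with only the standard library. This is a minimal sketch: the endpoint path comes from the curl example above, but the shape of the JSON response is not documented here, so the `fetch_quality` helper returns the parsed object for you to inspect rather than assuming field names.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(registry: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{API_BASE}/{registry}/{repo}"


def fetch_quality(registry: str, repo: str) -> dict:
    """Fetch the quality report and parse it as JSON.

    The response is assumed to be a JSON object; inspect the
    returned dict before relying on any particular keys.
    """
    with urllib.request.urlopen(quality_url(registry, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(quality_url("transformers", "Andras7/gpt2-pytorch"))
```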