akanyaani/gpt-2-tensorflow2.0

OpenAI GPT-2 pre-training and sequence-generation implementation in TensorFlow 2.0

Score: 49/100 (Emerging)

This project helps machine learning engineers and researchers implement and pre-train OpenAI's GPT-2 model using TensorFlow 2.0. You provide a large corpus of text data, and the system outputs a trained language model capable of generating coherent text sequences. It's intended for those working with advanced natural language processing tasks who need a TensorFlow 2.0 implementation.

263 stars. No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher specifically looking for a TensorFlow 2.0 implementation to pre-train or generate text using the GPT-2 architecture.

Not ideal if you are looking for a pre-trained model out-of-the-box, a simple text generation API, or a solution in a different framework like PyTorch.

natural-language-processing machine-learning-engineering deep-learning-research text-generation language-model-training
Stale (6 months) · No Package · No Dependents
Maintenance: 0/25
Adoption: 10/25
Maturity: 16/25
Community: 23/25


Stars: 263
Forks: 81
Language: Python
License: MIT
Last pushed: Mar 25, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/akanyaani/gpt-2-tensorflow2.0"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
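For programmatic access, the curl call above can be wrapped in a small Python helper. This is a minimal sketch using only the standard library; it assumes the endpoint returns JSON, and the `quality_url` / `fetch_quality` helper names are illustrative, not part of any documented client:

```python
import json
from urllib.request import urlopen

# Base endpoint taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-report URL for a given GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode a quality report (assumes a JSON response body)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

# Example: fetch_quality("akanyaani", "gpt-2-tensorflow2.0")
```

Keeping URL construction separate from the network call makes the helper easy to test offline and to reuse for other repositories in the same catalog.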