milmor/GPT

Implementation of a Generative Pretrained Transformer model in TensorFlow / Keras

Score: 40 / 100 (Emerging)

This is an implementation for developers who want to experiment with, or build upon, a Generative Pretrained Transformer (GPT) model in TensorFlow. You can feed it a dataset such as OpenWebText to train a new language model from scratch, or use the pre-trained GPT-Mini checkpoint to generate text from a given prompt. It targets machine learning engineers and researchers who want to understand and apply foundational large language models.
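At its core, a GPT model generates text autoregressively: the growing token sequence is fed back into the model, and the predicted next token is appended, one step at a time. A minimal framework-agnostic sketch of that loop follows; the model here is a toy stand-in, not this repository's actual API, and `generate` is a hypothetical helper name:

```python
import numpy as np

def generate(model_logits_fn, prompt_ids, max_new_tokens):
    """Greedy autoregressive decoding: repeatedly feed the growing
    sequence back in and append the highest-probability next token."""
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        logits = model_logits_fn(ids)      # shape: (vocab_size,)
        next_id = int(np.argmax(logits))   # greedy; sampling is also common
        ids.append(next_id)
    return ids

def toy_model(ids, vocab_size=10):
    """Toy 'model': always prefers token (last_id + 1) mod vocab_size."""
    logits = np.zeros(vocab_size)
    logits[(ids[-1] + 1) % vocab_size] = 1.0
    return logits

print(generate(toy_model, [3], 4))  # [3, 4, 5, 6, 7]
```

A real GPT would replace `toy_model` with a transformer forward pass over the token sequence, and typically sample from the softmax distribution (with temperature or top-k) instead of taking the argmax.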

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher wanting a clear, functional TensorFlow implementation of a GPT model for training or text generation experiments.

Not ideal if you are looking for a ready-to-use application or a high-level API for natural language processing without diving into model implementation details.

natural-language-processing machine-learning-research text-generation deep-learning language-model-development
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 17 / 25


Stars: 34
Forks: 11
Language: Python
License: MIT
Last pushed: Jun 08, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/milmor/GPT"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
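The same endpoint can be queried from Python with only the standard library. The `quality_url` and `fetch_quality` helper names below are illustrative, and the shape of the JSON payload is not documented here, so the parsed result is treated as an opaque dict:

```python
import json
import urllib.request
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner, repo):
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner, repo, timeout=10):
    """Fetch the quality report as parsed JSON (payload fields
    are undocumented here, so callers should inspect the dict)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=timeout) as resp:
        return json.load(resp)

print(quality_url("milmor", "GPT"))
# https://pt-edge.onrender.com/api/v1/quality/llm-tools/milmor/GPT
```

Keep the documented rate limits in mind (100 requests/day without a key, 1,000/day with a free key).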