akanyaani/gpt-2-tensorflow2.0
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
This project helps machine learning engineers and researchers implement and pre-train OpenAI's GPT-2 model using TensorFlow 2.0. You provide a large corpus of text data, and the system outputs a trained language model capable of generating coherent text sequences. It's intended for those working with advanced natural language processing tasks who need a TensorFlow 2.0 implementation.
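To make the scope concrete, here is a minimal sketch of a GPT-2-style decoder block in TensorFlow 2.x, using standard GPT-2 small hyperparameters (768-dim embeddings, 12 heads, 3072-unit feed-forward). It illustrates the kind of architecture the project implements; it is not code taken from the repository.

import tensorflow as tf

# Illustrative GPT-2-style decoder block (sketch, not the repo's code):
# masked self-attention followed by a position-wise MLP, each with a
# residual connection and pre-layer normalization.
class GPT2Block(tf.keras.layers.Layer):
    def __init__(self, d_model=768, num_heads=12, dff=3072, dropout=0.1):
        super().__init__()
        self.ln1 = tf.keras.layers.LayerNormalization(epsilon=1e-5)
        self.attn = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=d_model // num_heads, dropout=dropout)
        self.ln2 = tf.keras.layers.LayerNormalization(epsilon=1e-5)
        self.mlp = tf.keras.Sequential([
            tf.keras.layers.Dense(dff, activation="gelu"),
            tf.keras.layers.Dense(d_model),
            tf.keras.layers.Dropout(dropout),
        ])

    def call(self, x, training=False):
        seq_len = tf.shape(x)[1]
        # Lower-triangular mask: each token attends only to itself and earlier tokens.
        causal_mask = tf.linalg.band_part(tf.ones((seq_len, seq_len)), -1, 0)
        h = self.ln1(x)
        x = x + self.attn(h, h, attention_mask=causal_mask, training=training)
        h = self.ln2(x)
        return x + self.mlp(h, training=training)

# Example: a batch of 2 sequences of 16 token embeddings.
block = GPT2Block()
print(block(tf.random.normal((2, 16, 768))).shape)  # (2, 16, 768)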
263 stars. No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher specifically looking for a TensorFlow 2.0 implementation to pre-train or generate text using the GPT-2 architecture.
Not ideal if you are looking for a pre-trained model out-of-the-box, a simple text generation API, or a solution in a different framework like PyTorch.
Stars: 263
Forks: 81
Language: Python
License: MIT
Category:
Last pushed: Mar 25, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/akanyaani/gpt-2-tensorflow2.0"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
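The same data can be fetched from any HTTP client. A minimal Python sketch using the requests library (the response is assumed to be JSON; its fields are not documented here):

import requests

# Fetch the repository quality record from the public endpoint
# (no key needed up to 100 requests/day). The JSON structure is an
# assumption; only the URL comes from the listing above.
url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "llm-tools/akanyaani/gpt-2-tensorflow2.0")
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())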
Higher-rated alternatives
Nixtla/nixtla
TimeGPT-1: production ready pre-trained Time Series Foundation Model for forecasting and...
andrewdalpino/NoPE-GPT
A GPT-style small language model (SLM) with no positional embeddings (NoPE).
sigdelsanjog/gptmed
pip install gptmed
samkamau81/FinGPT_
FinGPT is an AI language model designed to understand and generate financial content. Built upon...
VinAIResearch/PhoGPT
PhoGPT: Generative Pre-training for Vietnamese (2023)