affjljoo3581/GPT2
PyTorch Implementation of OpenAI GPT-2
This project is a PyTorch implementation of OpenAI's GPT-2 language model. You supply a large text corpus, and the project trains a GPT-2 model that can generate human-like text. It is aimed at software developers and machine learning engineers who need to integrate or experiment with GPT-2.
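To make the "generate human-like text" part concrete, here is a minimal sketch of the autoregressive decoding loop a GPT-2-style model performs. The tiny model and the `generate` helper are illustrative stand-ins, not this repository's actual code: real GPT-2 stacks transformer decoder blocks where this toy uses a single embedding and linear head.

```python
import torch
import torch.nn as nn

class TinyCausalLM(nn.Module):
    """Toy stand-in for a GPT-2-style causal language model (hypothetical)."""

    def __init__(self, vocab_size: int = 16, d_model: int = 8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        # Returns next-token logits of shape (batch, seq, vocab).
        return self.head(self.embed(ids))

@torch.no_grad()
def generate(model: nn.Module, prompt: torch.Tensor, steps: int) -> torch.Tensor:
    # Autoregressive greedy decoding: feed the sequence back in and
    # append the argmax token at each step, as GPT-2 sampling does.
    ids = prompt
    for _ in range(steps):
        logits = model(ids)
        next_id = logits[:, -1].argmax(dim=-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=1)
    return ids

torch.manual_seed(0)
model = TinyCausalLM()
out = generate(model, torch.tensor([[1, 2, 3]]), steps=4)
print(out.shape)  # 3 prompt tokens + 4 generated tokens
```

In the real model, sampling strategies such as top-k or nucleus sampling usually replace the greedy `argmax` to make generations less repetitive.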
357 stars. No commits in the last 6 months.
Use this if you are a developer looking for a PyTorch-based, understandable, and optimized implementation of the GPT-2 model for text generation or training.
Not ideal if you are an end-user looking for a ready-to-use application to generate text without any coding or model training.
Stars
357
Forks
67
Language
Python
License
Apache-2.0
Category
Last pushed
Jul 04, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/affjljoo3581/GPT2"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
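The same endpoint can be called from Python. A minimal sketch, assuming only the URL shown in the curl example above (the helper name is illustrative, and the response schema is not documented here, so inspect the payload before parsing it):

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(repo: str) -> str:
    # repo is "owner/name"; keep the slash intact when percent-encoding.
    return f"{BASE}/{quote(repo, safe='/')}"

url = quality_url("affjljoo3581/GPT2")
print(url)
# Fetch with e.g. requests.get(url, timeout=10).json(); field names
# depend on the API, so check the returned JSON before relying on keys.
```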
Related models
gyunggyung/LFM2-KoEn-Tuning
Fine-tuning LFM2-1.2B for Korean-English bidirectional translation. GRPO+COMET & SFT Training,...
akanyaani/Illustrated_GPT2_With_Code
Explained GPT-2 Transformer model step by step with code.
b14ucky/Taco-LLMingway
Custom GPT Transformer architecture built from scratch in PyTorch. Trained on Taco Hemingway's...
dheeren-tejani/mini-lm-124m
Experimental GPT-2 scale (~124M param) LLM trained from scratch on Google Colab. Trained on C4,...