kakaobrain/kogpt
KakaoBrain KoGPT (Korean Generative Pre-trained Transformer)
KoGPT takes a Korean text prompt and produces coherent Korean text, letting you generate passages, complete sentences, or answer questions. It is aimed at researchers, content creators, and anyone who needs to generate or understand Korean at scale.
1,014 stars. No commits in the last 6 months.
Use this if you need to generate high-quality, human-like Korean text for various applications.
Not ideal if you primarily need languages other than Korean, or if you require fine-grained control over outputs to avoid potentially socially unacceptable text.
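The typical way to use the model can be sketched with Hugging Face transformers. The revision name and special tokens below follow the repo's published KoGPT6B-ryan1.5b-float16 checkpoint; treat them as assumptions if the repo has since changed, and note this is a 6B-parameter model, so loading it needs a GPU or substantial RAM.

```python
# Hedged sketch: generating Korean text with KoGPT via Hugging Face
# transformers. Revision name and special tokens are taken from the
# repo's published checkpoint and are assumptions if it has changed.
REPO = "kakaobrain/kogpt"
REVISION = "KoGPT6B-ryan1.5b-float16"  # half-precision checkpoint

def generate(prompt: str, max_length: int = 64) -> str:
    # Imports are local so the sketch can be imported without torch.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(
        REPO, revision=REVISION,
        bos_token="[BOS]", eos_token="[EOS]",
        unk_token="[UNK]", pad_token="[PAD]", mask_token="[MASK]",
    )
    model = AutoModelForCausalLM.from_pretrained(
        REPO, revision=REVISION, torch_dtype="auto", low_cpu_mem_usage=True
    ).eval()
    tokens = tokenizer.encode(prompt, return_tensors="pt")
    with torch.no_grad():
        out = model.generate(
            tokens, do_sample=True, temperature=0.8, max_length=max_length
        )
    return tokenizer.batch_decode(out)[0]

if __name__ == "__main__":
    # Korean prompt ("Through intelligence that thinks and acts like
    # a human, ..."); the model continues it.
    print(generate("인간처럼 생각하고, 행동하는 지능을 통해"))
```

Sampling parameters such as `temperature` are illustrative defaults, not values prescribed by the repo.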
Stars: 1,014
Forks: 138
Language: Python
License: —
Category: —
Last pushed: Jan 30, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kakaobrain/kogpt"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
Higher-rated alternatives
tabularis-ai/be_great
A novel approach for synthesizing tabular data using pretrained large language models
EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron...
shibing624/textgen
TextGen: Implementation of Text Generation models, include LLaMA, BLOOM, GPT2, BART, T5, SongNet...
ai-forever/ru-gpts
Russian GPT3 models.
AdityaNG/kan-gpt
The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold...