ai-forever/mgpt
Multilingual Generative Pretrained Model
This project offers a powerful language model capable of understanding and generating text across 61 different languages. It takes in a piece of text (a prompt) and can expand on it, translate it, or answer questions based on the input, producing new text in return. Anyone who works with text in multiple languages, such as content creators, researchers, or localization specialists, would find this useful.
207 stars. No commits in the last 6 months.
Use this if you need to generate, summarize, or understand text in a wide array of languages, especially those less commonly supported by other models.
Not ideal if you only need a single, high-resource language, where specialized monolingual models will likely perform better.
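As a minimal sketch of how such a model is typically prompted, the snippet below loads the checkpoint through the Hugging Face transformers library. The model id "ai-forever/mGPT" and the GPT-2-style classes are assumptions based on the project's published checkpoints; the download is large, so the call is kept behind a function.

```python
MODEL_ID = "ai-forever/mGPT"  # assumed Hugging Face Hub model id

def generate(prompt: str, max_new_tokens: int = 20) -> str:
    """Continue `prompt` greedily with mGPT (downloads the checkpoint)."""
    # Imported lazily so the sketch does not require the library until called.
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained(MODEL_ID)
    model = GPT2LMHeadModel.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens,
                             do_sample=False)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Prompts can be in any of the 61 supported languages.
    print(generate("La capitale de la France est"))
```

Because the model is multilingual, the same function works unchanged whether the prompt is in English, French, or a lower-resource language.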
Stars: 207
Forks: 23
Language: Jupyter Notebook
License: —
Category: —
Last pushed: May 13, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ai-forever/mgpt"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
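The same endpoint can be queried from Python with the standard library alone. This is a hedged sketch: the URL pattern is taken from the curl command above, but the JSON schema of the response is an assumption.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, repo: str) -> str:
    """Build the endpoint URL for a given ecosystem and repo slug."""
    return f"{BASE}/{ecosystem}/{repo}"

def fetch_quality(ecosystem: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON payload (schema assumed)."""
    with urllib.request.urlopen(quality_url(ecosystem, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Matches the curl example above, without needing an API key.
    print(quality_url("transformers", "ai-forever/mgpt"))
```

Within the free tier, no authentication header is needed; only the ecosystem and repo slug change between lookups.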
Higher-rated alternatives
tabularis-ai/be_great: A novel approach for synthesizing tabular data using pretrained large language models
EleutherAI/gpt-neox: An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron...
shibing624/textgen: TextGen: Implementation of Text Generation models, include LLaMA, BLOOM, GPT2, BART, T5, SongNet...
ai-forever/ru-gpts: Russian GPT3 models.
AdityaNG/kan-gpt: The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold...