imcaspar/gpt2-ml

GPT2 for Multiple Languages, including pretrained models (GPT-2 multilingual support; 1.5-billion-parameter Chinese pretrained model)

Score: 51 / 100 (Established)

This project provides a GPT-2 language model pretrained to generate human-like text in multiple languages, with a strong focus on Chinese. You provide a prompt or starting text, and the model expands on it, generating coherent and contextually relevant continuations. This is useful for content creators, marketers, or researchers working with large volumes of text in various languages.

1,703 stars. No commits in the last 6 months.

Use this if you need to automatically generate creative content, expand on existing text, or perform advanced natural language processing tasks, especially in Chinese.

Not ideal if you're looking for a simple, off-the-shelf translation tool or if your primary need is strictly rule-based text manipulation.

content-creation text-generation multilingual-publishing natural-language-processing chinese-language-ai
Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 25 / 25


Stars: 1,703
Forks: 330
Language: Python
License: Apache-2.0
Last pushed: May 22, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/imcaspar/gpt2-ml"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
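If you want to consume the endpoint from code rather than the shell, a minimal Python sketch follows. The URL path shape (`/quality/<category>/<owner>/<repo>`) is taken from the curl example above; the response is assumed to be JSON, and its fields are not documented here, so `fetch_quality` simply returns the decoded payload as-is.

```python
import json
import urllib.request

# Base URL taken from the curl example on this page.
BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repository.

    The path shape mirrors the curl example; the set of valid
    categories (e.g. "nlp") is an assumption, not documented here.
    """
    return f"{BASE_URL}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload for one repository.

    Assumes the endpoint returns JSON; the response schema is not
    specified on this page, so callers should inspect the keys.
    """
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Reproduces the curl example for imcaspar/gpt2-ml.
    data = fetch_quality("nlp", "imcaspar", "gpt2-ml")
    print(json.dumps(data, indent=2))
```

Unauthenticated requests count against the 100/day limit noted above; with a free key you get 1,000/day.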