fattorib/Little-GPT

GPT* - Training faster small transformers using ALiBi, Parallel Residual Connections and more!

22 / 100
Experimental

This project provides small pre-trained GPT-style language models, plus the code to train them, aimed at faster training and inference than standard GPT-2 variants. Given text as input, the models generate human-like continuations. It targets machine learning engineers and researchers who want to develop or experiment with transformer-based language models.

No commits in the last 6 months.

Use this if you are a machine learning practitioner looking for transformer models that train and run faster than standard GPT-2 variants.

Not ideal if you need a plug-and-play solution for non-technical users, as it requires some familiarity with model deployment.
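The tagline mentions ALiBi (Attention with Linear Biases), which replaces learned positional embeddings with a fixed, head-specific linear penalty on attention scores. As a reference point, here is a minimal sketch of that bias following the standard formulation; it is an illustration of the technique, not necessarily how Little-GPT implements it:

```python
import math

def alibi_slopes(n_heads: int) -> list[float]:
    """Head-specific slopes: for n heads (a power of two),
    head i gets slope 2^(-8 * i / n), i = 1..n."""
    ratio = 2 ** (-8.0 / n_heads)
    return [ratio ** i for i in range(1, n_heads + 1)]

def alibi_bias(seq_len: int, slope: float) -> list[list[float]]:
    """Causal bias matrix added to one head's attention scores:
    query position q penalizes key position k by slope * (q - k);
    future positions (k > q) are masked with -inf."""
    return [
        [-slope * (q - k) if k <= q else float("-inf") for k in range(seq_len)]
        for q in range(seq_len)
    ]

if __name__ == "__main__":
    # For 8 heads the slopes are 1/2, 1/4, ..., 1/256.
    print(alibi_slopes(8))
    print(alibi_bias(3, 0.5))
```

Because the bias depends only on relative distance, models trained this way can extrapolate to sequence lengths longer than those seen in training, which is part of the speed/efficiency pitch above.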

natural-language-processing machine-learning-engineering text-generation transformer-models model-optimization
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 0 / 25


Stars

21

Forks

Language

Python

License

MIT

Last pushed

Oct 29, 2022

Commits (30d)

0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/fattorib/Little-GPT"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
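The same endpoint can be called from Python using only the standard library. A small sketch, assuming the URL pattern shown in the curl example above; the JSON fields of the response are not documented here, so the helper simply returns the parsed payload:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(registry: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{BASE}/{registry}/{owner}/{repo}"

def fetch_quality(registry: str, owner: str, repo: str) -> dict:
    """Fetch the quality report as parsed JSON
    (100 requests/day without an API key)."""
    with urllib.request.urlopen(quality_url(registry, owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("transformers", "fattorib", "Little-GPT"))
```

Swap in your own registry/owner/repo triple to query other projects.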