dpressel/mint

MinT: Minimal Transformer Library and Tutorials

Quality score: 29 / 100 (Experimental)

This is a hands-on toolkit for machine learning engineers and researchers to build core Transformer models from the ground up. It provides clear tutorials and a minimal Python library to understand how models like BERT, GPT, and T5 work internally. You can start with raw text data, train these models, and then use them for tasks like text completion or classification.
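
To give a flavor of the from-scratch approach, below is a minimal sketch of single-head scaled dot-product attention, the core operation behind BERT, GPT, and T5. It is plain PyTorch written for illustration only; the class name and structure are hypothetical and do not reflect MinT's actual API.

# Illustrative sketch only; names here are hypothetical, not from MinT.
import math

import torch
import torch.nn as nn

class MinimalSelfAttention(nn.Module):
    """Single-head scaled dot-product self-attention."""

    def __init__(self, d_model):
        super().__init__()
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)
        self.value = nn.Linear(d_model, d_model)

    def forward(self, x):
        # x has shape [batch, seq_len, d_model]
        q, k, v = self.query(x), self.key(x), self.value(x)
        # softmax(Q K^T / sqrt(d_model)) gives the attention weights
        scores = q @ k.transpose(-2, -1) / math.sqrt(x.size(-1))
        return torch.softmax(scores, dim=-1) @ v

x = torch.randn(2, 8, 64)                 # batch of 2, 8 tokens, 64-dim embeddings
print(MinimalSelfAttention(64)(x).shape)  # torch.Size([2, 8, 64])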

261 stars. No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher who wants to deeply understand and implement Transformer architectures for natural language processing from foundational principles.

Not ideal if you are looking for a high-level API to quickly apply pre-trained Transformer models without needing to build them yourself.

natural-language-processing deep-learning-engineering transformer-models text-generation text-classification
Status: No License · Stale (6 months) · No Package · No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 11 / 25


Stars: 261
Forks: 15
Language: Python
License: None
Last pushed: Jul 26, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/dpressel/mint"

Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000/day.
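
The same endpoint can also be queried from Python. Here is a small sketch using only the standard library; it simply pretty-prints the raw JSON, since the response schema is not documented on this page:

import json
from urllib.request import urlopen

URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/dpressel/mint"

with urlopen(URL) as resp:
    data = json.load(resp)

# Dump the payload to inspect the actual field names.
print(json.dumps(data, indent=2))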