gmontamat/poor-mans-transformers

Implement Transformers (and Deep Learning) from scratch in NumPy

Score: 30/100 (Emerging)

This project offers a hands-on way to understand how advanced deep learning models, specifically Transformers, are built from the ground up. You implement the core components yourself and train models for tasks like duplicate-question detection or named entity recognition, using nothing but NumPy. It is aimed at deep learning students, researchers, and educators who want to explore the foundational mechanics of neural networks without relying on high-level frameworks.
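
To give a feel for what implementing these components involves, here is a minimal sketch of scaled dot-product attention, the core operation inside a Transformer layer, written in plain NumPy. The function name and shapes are illustrative and not taken from the repository itself.

import numpy as np

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (seq_len, d_k) arrays of queries, keys, and values
    d_k = q.shape[-1]
    # Similarity scores, scaled by sqrt(d_k) to stabilize the softmax
    scores = q @ k.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Attention output: a weighted average of the value vectors
    return weights @ v

# Self-attention over 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)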

No commits in the last 6 months.

Use this if you are a deep learning student or educator who wants to learn the inner workings of Transformer models by implementing them from scratch using only NumPy.

Not ideal if you need to quickly build and deploy high-performance deep learning models for production, as this is a learning tool focused on clarity over efficiency.

Topics: deep-learning-education, natural-language-processing, neural-network-architecture, machine-learning-fundamentals
Status: Stale (6m), No Package, No Dependents
Maintenance: 0/25
Adoption: 7/25
Maturity: 16/25
Community: 7/25


Stars: 28
Forks: 2
Language: Python
License: MIT
Last pushed: Oct 03, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/gmontamat/poor-mans-transformers"

The API is open to everyone at 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
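
The same data can also be fetched from Python. The sketch below is a minimal example assuming the endpoint returns JSON; the exact response fields are not documented on this page, so inspect the payload.

import requests

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/gmontamat/poor-mans-transformers")

response = requests.get(URL, timeout=10)
response.raise_for_status()
data = response.json()  # schema is an assumption; print it to see the fields
print(data)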