BubbleJoe-BrownU/TransformerHub

This is a repository of transformer-based models, including the original Transformer, GPT, BERT, ViT, and more to be implemented along my journey into the fascinating field of deep learning.

Score: 34 / 100 (Emerging)

This is a collection of fundamental transformer architectures such as GPT, BERT, and ViT, showcasing how they are built and designed. It provides a structured way to understand the internal workings of these models, from their attention mechanisms to their positional embeddings. It is aimed primarily at deep learning practitioners, students, and researchers who want to study how these models are implemented in code.
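To give a flavor of the kind of building block such reference implementations cover, here is a minimal scaled dot-product attention sketch in NumPy. It is not taken from the repository; the function and variable names are illustrative only.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Single-head attention: softmax(q k^T / sqrt(d_k)) v."""
    # q, k, v: (seq_len, d_k) matrices
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v               # weighted sum of the value vectors

# tiny smoke test: identity inputs, 4 positions, d_k = 4
out = scaled_dot_product_attention(np.eye(4), np.eye(4), np.eye(4))
```

Because the softmax rows sum to 1, each output row is a convex combination of the value rows; real implementations add multi-head projections, masking, and dropout on top of this core.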

No commits in the last 6 months.

Use this if you are a deep learning student or researcher looking for reference implementations of core transformer models to understand their architecture and advanced programming techniques.

Not ideal if you need state-of-the-art models for immediate application in real-world tasks, as these are primarily for educational and reference purposes.

deep-learning machine-learning-engineering natural-language-processing-research computer-vision-research neural-network-architecture
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 8 / 25
Community 17 / 25


Stars: 87
Forks: 16
Language: Python
License: none
Last pushed: Mar 12, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/BubbleJoe-BrownU/TransformerHub"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
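The same endpoint can be queried from Python with only the standard library. This is a minimal sketch: the URL is taken from the curl command above, but the shape of the JSON response is not documented here, so the helper simply returns the parsed payload as-is.

```python
import json
import urllib.request

# Endpoint from the curl example above (anonymous access: 100 requests/day)
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/BubbleJoe-BrownU/TransformerHub")

def fetch_quality(url=URL, timeout=10):
    """Fetch the quality report and return it as a parsed JSON object."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

# usage: print(json.dumps(fetch_quality(), indent=2))
```

With a free API key (1,000 requests/day) you would presumably attach it to the request; the exact header or query parameter is not shown on this page, so it is omitted here.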