Whiax/BERT-Transformer-Pytorch
Basic implementation of BERT and the Transformer in PyTorch in one short Python file (also includes a "predict the next word" GPT task)
This project offers a foundational, easy-to-understand codebase for those curious about the inner workings of modern natural language processing models. It takes raw text data and trains a simplified BERT model to learn relationships between words; these learned relationships can then be visualized. It's designed for NLP beginners and anyone who wants to grasp the core mechanics of Transformer models.
No commits in the last 6 months.
Use this if you are an aspiring data scientist or AI enthusiast who wants to learn the fundamental concepts behind text understanding models like BERT and Transformers through a hands-on, simplified code example.
Not ideal if you need a production-ready, highly optimized, or full-featured implementation of BERT or Transformer models for advanced NLP tasks.
Stars: 45
Forks: 7
Language: Python
License: MIT
Category:
Last pushed: Jan 09, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Whiax/BERT-Transformer-Pytorch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
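If you prefer to call the endpoint from Python instead of curl, a minimal sketch looks like the following. The URL is taken from the page above; the shape of the JSON response is not documented here, so the code only decodes it generically rather than assuming specific field names.

```python
import json
import urllib.request

# Endpoint shown on this page (anonymous access: 100 requests/day).
URL = ("https://pt-edge.onrender.com/api/v1/quality/transformers/"
       "Whiax/BERT-Transformer-Pytorch")

def fetch_quality(endpoint: str, timeout: float = 10.0) -> dict:
    """Fetch the quality report and decode it as JSON.

    The response schema is an assumption -- inspect the returned
    dict before relying on any particular key.
    """
    with urllib.request.urlopen(endpoint, timeout=timeout) as resp:
        return json.load(resp)

# Example (performs a network request):
#   report = fetch_quality(URL)
#   print(report)
```

Given the 100 requests/day anonymous limit, it is worth caching the response locally rather than refetching it on every run.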
Higher-rated alternatives
huggingface/transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in...
kyegomez/LongNet
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
pbloem/former
Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg)
NVIDIA/FasterTransformer
Transformer related optimization, including BERT, GPT
kyegomez/SimplifiedTransformers
SimplifiedTransformer simplifies transformer block without affecting training. Skip connections,...