Whiax/BERT-Transformer-Pytorch

A basic implementation of BERT and the Transformer in PyTorch in one short Python file (also includes a "predict next word" GPT-style task)

Quality score: 38 / 100 (Emerging)

This project offers a foundational, easy-to-understand codebase for those curious about the inner workings of modern natural language processing models. It takes raw text data and trains a simplified BERT model to learn relationships between words, which can be visualized. It's designed for NLP beginners and individuals who want to grasp the core mechanics of Transformer models.
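The "relationships between words" that BERT and GPT-style models learn are computed by self-attention. As a rough illustrative sketch of that core operation (in NumPy for self-containedness, not PyTorch, and not the repo's actual code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d)) V — the core Transformer operation."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # pairwise query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # each row sums to 1
    return weights @ V, weights

# 4 token embeddings of dimension 8 (random stand-ins for learned vectors)
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))

out, w = scaled_dot_product_attention(Q, K, V)
# out: one context-mixed vector per token; w: the word-to-word
# attention weights that projects like this one typically visualize
```

The attention-weight matrix `w` is exactly the kind of word-relationship map that can be visualized after training.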

No commits in the last 6 months.

Use this if you are an aspiring data scientist or AI enthusiast who wants to learn the fundamental concepts behind text understanding models like BERT and Transformers through a hands-on, simplified code example.

Not ideal if you need a production-ready, highly optimized, or full-featured implementation of BERT or Transformer models for advanced NLP tasks.

Tags: natural-language-processing, machine-learning-education, text-analysis-fundamentals, AI-learning, deep-learning-basics
Status: Stale (6 months) · No package published · No dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 16 / 25
Community 14 / 25


Stars: 45
Forks: 7
Language: Python
License: MIT
Last pushed: Jan 09, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Whiax/BERT-Transformer-Pytorch"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
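The same endpoint can be called from Python. Only the URL pattern is taken from the curl example above; the helper name and the assumption that the path segments are platform/owner/repo are illustrative:

```python
# Hypothetical helper around the endpoint shown in the curl example.
# Assumed URL shape: .../api/v1/quality/<platform>/<owner>/<repo>
def build_quality_url(platform: str, owner: str, repo: str) -> str:
    return f"https://pt-edge.onrender.com/api/v1/quality/{platform}/{owner}/{repo}"

url = build_quality_url("transformers", "Whiax", "BERT-Transformer-Pytorch")
# Fetching (requires network access), e.g.:
#   import urllib.request, json
#   data = json.load(urllib.request.urlopen(url))
```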