10-OASIS-01/Autoregressive-Language-Model
This project is a from-scratch implementation of a Transformer-based autoregressive language model, covering the full pipeline of language modeling: data preprocessing, tokenizer construction, model training, evaluation, and inference.
It is written as a clear, hands-on guide for anyone learning deep learning and natural language processing: you supply raw text, and the project walks you through building a tokenizer, training the model, and then generating new text or measuring the model's performance.
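To make the autoregressive objective concrete, here is a minimal PyTorch-style sketch of the ideas such a project teaches: a character-level tokenizer, next-token prediction with a causal mask and shifted targets, and token-by-token generation. All names and hyperparameters below (TinyLM, d_model, and so on) are illustrative assumptions, not code from this repository.

# Minimal sketch, not code from this repository: character-level tokenization,
# next-token training, and greedy autoregressive generation.
import torch
import torch.nn as nn

text = "hello world, hello transformer"
vocab = sorted(set(text))                      # toy character "tokenizer"
stoi = {ch: i for i, ch in enumerate(vocab)}
ids = torch.tensor([stoi[ch] for ch in text])

class TinyLM(nn.Module):
    def __init__(self, vocab_size, d_model=32, nhead=4, max_len=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, x):
        T = x.size(1)
        h = self.embed(x) + self.pos(torch.arange(T, device=x.device))
        # Causal mask: each position may only attend to earlier positions.
        mask = nn.Transformer.generate_square_subsequent_mask(T)
        return self.head(self.encoder(h, mask=mask))

model = TinyLM(len(vocab))
optim = torch.optim.AdamW(model.parameters(), lr=3e-3)
x, y = ids[None, :-1], ids[None, 1:]           # target = input shifted by one
for _ in range(200):                           # toy training loop
    loss = nn.functional.cross_entropy(
        model(x).reshape(-1, len(vocab)), y.reshape(-1))
    optim.zero_grad()
    loss.backward()
    optim.step()

# Generation: feed the model's own output back in, one token at a time.
ctx = ids[None, :5]
for _ in range(20):
    nxt = model(ctx)[:, -1].argmax(-1, keepdim=True)
    ctx = torch.cat([ctx, nxt], dim=1)
print("".join(vocab[i] for i in ctx[0].tolist()))

The two details that make the model autoregressive are the causal attention mask and the one-position shift between inputs and targets; everything else is ordinary supervised training.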
No commits in the last 6 months.
Use this if you are a beginner in deep learning or NLP and want to understand the inner workings of a Transformer language model, including manual tokenization and model training, using a single GPU or CPU.
Not ideal if you need a production-ready, highly optimized, or multi-GPU distributed training solution, as this project prioritizes transparency and learning over advanced performance.
Stars: 9
Forks: —
Language: Python
License: MIT
Category: transformers
Last pushed: Dec 06, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/10-OASIS-01/Autoregressive-Language-Model"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
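For scripted access, the same endpoint can be fetched from Python. This sketch assumes only what the curl line above shows, plus that the response body is JSON; the response schema is not documented here.

# Minimal sketch: fetch the same endpoint from Python (keyless tier).
# Assumes the response body is JSON; its schema is not shown here.
import json
import urllib.request

url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/10-OASIS-01/Autoregressive-Language-Model")
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)
print(json.dumps(data, indent=2))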
Higher-rated alternatives
AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...
mosaicml/llm-foundry
LLM training code for Databricks foundation models
rickiepark/llm-from-scratch
Code repository for <밑바닥부터 만들면서 공부하는 LLM> (Gilbut, 2025), the Korean edition of Build a Large Language Model (From Scratch)