AviSoori1x/makeMoE

From-scratch implementation of a sparse mixture of experts language model, inspired by Andrej Karpathy's makemore :)

Quality score: 46 / 100 (Emerging)

This project helps machine learning engineers understand and build a 'sparse mixture of experts' language model from the ground up. It takes a dataset of text (like Shakespeare's writings) as input and outputs a trained model capable of generating new text in a similar style. It's intended for engineers or researchers working with language models who want to explore advanced architectures.

793 stars. No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher who wants to learn the inner workings of sparse mixture of experts (MoE) architectures for language models through a clear, from-scratch implementation.

Not ideal if you are a practitioner looking for a pre-trained, high-performance language model for immediate use in applications.
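
To see what "sparse mixture of experts" means in practice, here is a minimal PyTorch sketch of the top-k routing idea such a model is built around. It is an illustration under stated assumptions, not the repo's exact code: the names (Expert, SparseMoE, n_embd, num_experts, top_k) are illustrative, and for simplicity every expert runs on every token, with sparsity applied only through the gating weights.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Expert(nn.Module):
    # A small feed-forward block; each expert is one of these. (Illustrative.)
    def __init__(self, n_embd):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.ReLU(),
            nn.Linear(4 * n_embd, n_embd),
        )

    def forward(self, x):
        return self.net(x)

class SparseMoE(nn.Module):
    # Top-k gating: a router scores all experts per token, and only the
    # k highest-scoring experts contribute to that token's output.
    def __init__(self, n_embd, num_experts=4, top_k=2):
        super().__init__()
        self.router = nn.Linear(n_embd, num_experts)
        self.experts = nn.ModuleList(Expert(n_embd) for _ in range(num_experts))
        self.top_k = top_k

    def forward(self, x):  # x: (batch, seq_len, n_embd)
        logits = self.router(x)                               # (B, T, num_experts)
        topk_logits, topk_idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(topk_logits, dim=-1)              # renormalize over the chosen k
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            # Per-token weight for expert i; zero where it was not selected.
            w = (weights * (topk_idx == i)).sum(dim=-1, keepdim=True)  # (B, T, 1)
            # Note: for clarity this runs every expert densely; a real sparse
            # implementation would gather only the tokens routed to expert i.
            out = out + w * expert(x)
        return out

x = torch.randn(2, 8, 32)                         # (batch, seq_len, n_embd)
moe = SparseMoE(n_embd=32, num_experts=4, top_k=2)
print(moe(x).shape)                               # torch.Size([2, 8, 32])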

language-modeling neural-networks deep-learning-research text-generation machine-learning-engineering
Stale (6 months) · No package published · No dependents
Score breakdown (the four sub-scores sum to the overall 46 / 100):
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 20 / 25


Stars: 793
Forks: 92
Language: Jupyter Notebook
License: MIT
Last pushed: Oct 30, 2024
Commits (last 30 days): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/AviSoori1x/makeMoE"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
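
The same endpoint can also be called from a script. A minimal sketch in Python using the requests library, assuming (as is typical for such APIs) that the response body is JSON:

import requests

# Same endpoint as the curl example above; up to 100 requests/day need no key.
url = "https://pt-edge.onrender.com/api/v1/quality/transformers/AviSoori1x/makeMoE"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
data = resp.json()  # assumed JSON response; the schema is not documented here
print(data)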