LoserCheems/WonderfulMatrices
Wonderful Matrices to Build Small Language Models
This project helps machine learning engineers and researchers build small language models (SLMs) more efficiently. It provides tools and architectures for processing text data, then training and evaluating SLMs with techniques from the "Wonderful Matrices" paper, yielding models with fewer cache states and greater knowledge capacity.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher who wants to experiment with cutting-edge language model architectures to build more efficient and capable small language models.
Not ideal if you want a ready-to-use large language model for immediate deployment, without understanding or customizing its internals.
Stars
44
Forks
2
Language
Python
License
Apache-2.0
Category
Last pushed
Feb 15, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/LoserCheems/WonderfulMatrices"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
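The same endpoint shown in the curl example can be called from Python. A minimal sketch, assuming the API returns JSON; the `Authorization: Bearer` header for keyed access and the response shape are assumptions, not documented behavior:

```python
import json
import urllib.request

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(ecosystem: str, repo: str) -> str:
    """Build the API URL for a repository, e.g. quality_url(
    'transformers', 'LoserCheems/WonderfulMatrices')."""
    return f"{API_BASE}/{ecosystem}/{repo}"


def fetch_quality(ecosystem: str, repo: str, api_key=None) -> dict:
    """Fetch and decode the JSON payload for a repository.

    Passing api_key is assumed to unlock the higher rate limit;
    the header name is a guess -- check the API docs.
    """
    req = urllib.request.Request(quality_url(ecosystem, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Anonymous calls are limited to 100 requests/day, so cache responses locally if you poll many repositories.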
Higher-rated alternatives
huggingface/transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in...
kyegomez/LongNet
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
pbloem/former
Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg)
NVIDIA/FasterTransformer
Transformer related optimization, including BERT, GPT
kyegomez/SimplifiedTransformers
SimplifiedTransformer simplifies transformer block without affecting training. Skip connections,...