antonio-f/mixture-of-experts-from-scratch
Mixture of Experts from scratch
This project offers a straightforward from-scratch implementation of the Mixture of Experts (MoE) technique, in which a gating network routes each input to specialized expert models and combines their outputs. It is a good fit for anyone who wants to explore or learn MoE without a complex setup.
No commits in the last 6 months.
Use this if you are a machine learning student, researcher, or practitioner wanting to learn the fundamental concepts of Mixture of Experts through a simple, hands-on example.
Not ideal if you need a production-ready, highly optimized, or feature-rich Mixture of Experts system for complex real-world applications.
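To illustrate the core idea the repository teaches, here is a minimal sketch of a dense MoE layer in NumPy: a softmax gate scores the experts per input, and the layer returns the gate-weighted sum of the expert outputs. This is a hypothetical example for orientation, not code from the repository; the class and parameter names are my own.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MixtureOfExperts:
    """Dense MoE layer: every expert runs; the gate weights their outputs."""

    def __init__(self, d_in, d_out, n_experts, seed=0):
        rng = np.random.default_rng(seed)
        # Each expert is a simple linear map (stand-in for a small network).
        self.experts = [rng.normal(0, 0.1, (d_in, d_out)) for _ in range(n_experts)]
        # The gating network scores experts from the same input.
        self.gate = rng.normal(0, 0.1, (d_in, n_experts))

    def forward(self, x):
        weights = softmax(x @ self.gate)                         # (batch, n_experts)
        outs = np.stack([x @ W for W in self.experts], axis=1)   # (batch, n_experts, d_out)
        # Combine: per-row weighted sum over the expert axis.
        return np.einsum('be,bed->bd', weights, outs)

moe = MixtureOfExperts(d_in=4, d_out=2, n_experts=3)
y = moe.forward(np.ones((5, 4)))
print(y.shape)  # (5, 2)
```

Sparse variants (e.g. top-k routing, as in large MoE language models) add a selection step on the gate weights, but the gate-then-combine structure above is the common core.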
Stars
13
Forks
1
Language
Jupyter Notebook
License
—
Category
Last pushed
Apr 12, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/antonio-f/mixture-of-experts-from-scratch"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
AdaptiveMotorControlLab/CEBRA
Learnable latent embeddings for joint behavioral and neural analysis - Official implementation of CEBRA
theolepage/sslsv
Toolkit for training and evaluating Self-Supervised Learning (SSL) frameworks for Speaker...
PaddlePaddle/PASSL
PASSL includes image self-supervised learning algorithms such as SimCLR, MoCo v1/v2, BYOL, CLIP, PixPro, SimSiam, SwAV, BEiT, and MAE, as well as Vision...
YGZWQZD/LAMDA-SSL
30 Semi-Supervised Learning Algorithms
ModSSC/ModSSC
ModSSC: A Modular Framework for Semi-Supervised Classification