SuperBruceJia/Awesome-Mixture-of-Experts
Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
This resource is a curated collection of materials for anyone interested in understanding or implementing Mixture of Experts (MoE) architectures in machine learning. It provides a comprehensive overview of research papers, courses, presentations, and projects related to MoE and Mixture of Multimodal Experts (MoME). Researchers, students, and practitioners in AI and machine learning will find it useful for exploring both foundational concepts and recent advances in scaling neural networks; a minimal sketch of the core routing idea appears after the usage notes below.
No commits in the last 6 months.
Use this if you are a machine learning researcher, student, or practitioner looking for a structured collection of resources on Mixture of Experts models and their applications.
Not ideal if you are looking for an immediate, ready-to-use code library or a step-by-step tutorial for a specific implementation task.
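For readers new to the area, the common idea behind most of the material collected here is a routed layer: a small gating network scores a set of expert networks per input, and only the top-scoring experts run. The following is a minimal top-k gating sketch in PyTorch; all names and sizes are illustrative, and the dense per-expert loop stands in for the sparse dispatch that production systems use.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy top-k gated Mixture of Experts layer (dense dispatch for clarity)."""

    def __init__(self, d_model: int, d_hidden: int, n_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )
        # The gate scores every expert for every token.
        self.gate = nn.Linear(d_model, n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, d_model). Score experts, keep the top-k per token.
        logits = self.gate(x)                                   # (batch, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)          # (batch, top_k)
        weights = F.softmax(weights, dim=-1)                    # renormalize over chosen experts
        out = torch.zeros_like(x)
        # Dense loop over experts; real systems route tokens sparsely instead.
        for e, expert in enumerate(self.experts):
            mask = idx == e                                     # where expert e was selected
            if mask.any():
                rows = mask.any(dim=-1)
                w = (weights * mask).sum(dim=-1, keepdim=True)  # gate weight for expert e
                out[rows] += w[rows] * expert(x[rows])
        return out

x = torch.randn(4, 32)
layer = MoELayer(d_model=32, d_hidden=64, n_experts=8, top_k=2)
print(layer(x).shape)  # torch.Size([4, 32])

Real MoE implementations add load-balancing losses and expert capacity limits so tokens spread evenly across experts; the sketch omits these for brevity.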
Stars: 57
Forks: 7
Language: —
License: MIT
Category: —
Last pushed: Oct 06, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/SuperBruceJia/Awesome-Mixture-of-Experts"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
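The same data can be retrieved programmatically. Below is a minimal Python equivalent of the curl call above, assuming only that the endpoint returns JSON; the response schema and the header used to pass an API key are not documented here, so neither is assumed.

import requests

# Endpoint taken verbatim from the curl example above.
URL = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/SuperBruceJia/Awesome-Mixture-of-Experts"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()  # surfaces HTTP errors, e.g. once the 100 requests/day limit is hit
data = resp.json()       # assumed to parse as JSON; inspect the payload to learn the schema
print(data)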
Higher-rated alternatives
InternLM/xtuner
A Next-Generation Training Engine Built for Ultra-Large MoE Models
AmanPriyanshu/GPT-OSS-MoE-ExpertFingerprinting
ExpertFingerprinting: Behavioral Pattern Analysis and Specialization Mapping of Experts in...
arm-education/Advanced-AI-Mixture-of-Experts
Hands-on course materials for ML engineers to implement and optimize Mixture of Experts models:...
sumitdotml/moe-emergence
A project highlighting the emergent expert specialization in Mixture of Experts (MoEs) across 3...
iahuang/cosmoe
Enabling inference of large mixture-of-experts (MoE) models on Apple Silicon using dynamic offloading.