SuperBruceJia/Awesome-Mixture-of-Experts

Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)

Overall score: 38 / 100 (Emerging)

This resource is a curated collection of materials for anyone interested in understanding or implementing Mixture of Experts (MoE) architectures in machine learning. It provides a comprehensive overview of research papers, courses, presentations, and projects related to MoE and Mixture of Multimodal Experts (MoME). Researchers, students, and practitioners in AI and machine learning fields will find this useful for exploring foundational concepts and recent advancements in scaling neural networks.
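The curated list is the resource itself; purely for orientation, here is a minimal, illustrative sketch of the idea the listed papers build on: a gating network scores a set of experts for each input, only the top-k experts are evaluated, and their outputs are combined by the (renormalized) gate weights. All names and sizes below (d_model, n_experts, top_k) are made up for the example and are not taken from the list.

# Minimal sketch of a top-k gated Mixture-of-Experts layer (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" here is a plain linear map; real MoE layers typically use small MLPs.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
gate_w = rng.normal(size=(d_model, n_experts))  # router / gating weights

def moe_layer(x):
    """Route a single token vector x through its top-k experts."""
    logits = x @ gate_w                       # one score per expert
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                      # softmax over experts
    top = np.argsort(probs)[-top_k:]          # indices of the k highest-scoring experts
    weights = probs[top] / probs[top].sum()   # renormalize over the chosen experts
    # Weighted sum of the chosen experts' outputs; unchosen experts are skipped,
    # which is what makes MoE cheaper than a dense layer of comparable capacity.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.normal(size=d_model)
print(moe_layer(x).shape)  # (8,)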

No commits in the last 6 months.

Use this if you are a machine learning researcher, student, or practitioner looking for a structured collection of resources on Mixture of Experts models and their applications.

Not ideal if you are looking for an immediate, ready-to-use code library or a step-by-step tutorial for a specific implementation task.

Tags: AI research, machine learning, neural networks, large language models, multimodal AI
Status: Stale (6 months), no package, no dependents
Maintenance: 2 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 12 / 25


Stars: 57
Forks: 7
Language:
License: MIT
Last pushed: Oct 06, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/SuperBruceJia/Awesome-Mixture-of-Experts"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
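If you prefer to pull the same data from a script, a minimal Python sketch is below. It assumes only that the endpoint returns JSON; the exact field names are not documented here, so the example simply prints the payload for inspection.

# Minimal sketch of fetching the same endpoint from Python (response schema assumed to be JSON).
import json
import urllib.request

URL = "https://pt-edge.onrender.com/api/v1/quality/llm-tools/SuperBruceJia/Awesome-Mixture-of-Experts"

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))  # inspect the returned fields before relying on any of them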