cmu-flame/FLAME-MoE

Official repository for FLAME-MoE: A Transparent End-to-End Research Platform for Mixture-of-Experts Language Models

Overall score: 32 / 100 (Emerging)

This platform helps AI researchers develop and test Mixture-of-Experts (MoE) language models. It takes raw text data and configuration settings as input and produces trained MoE models and evaluation metrics. It is aimed at AI researchers and machine learning engineers working on advanced language model architectures.
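To make the architecture concrete, here is a toy sketch of the core Mixture-of-Experts idea the platform studies: a learned router scores each token and only the top-k experts process it. This is an illustrative example with made-up toy weights, not FLAME-MoE's actual implementation.

```python
# Toy top-k MoE layer: each "expert" is just a weight vector, and a
# router picks which experts handle a given token. Illustrative only;
# not taken from the FLAME-MoE codebase.
import math
import random

random.seed(0)

NUM_EXPERTS = 4
TOP_K = 2
DIM = 8

# Toy experts and router weights (random, for demonstration).
experts = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(NUM_EXPERTS)]
router = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(NUM_EXPERTS)]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(x):
    """Route token x to its top-k experts and mix their outputs."""
    # Router assigns a probability to each expert for this token.
    logits = [sum(xi * wi for xi, wi in zip(x, w)) for w in router]
    probs = softmax(logits)
    # Keep only the top-k experts; renormalize their weights.
    top = sorted(range(NUM_EXPERTS), key=lambda i: probs[i], reverse=True)[:TOP_K]
    norm = sum(probs[i] for i in top)
    out = 0.0
    for i in top:
        expert_out = sum(xi * wi for xi, wi in zip(x, experts[i]))
        out += (probs[i] / norm) * expert_out
    return out, top

token = [random.gauss(0, 1) for _ in range(DIM)]
y, chosen = moe_forward(token)
```

Because only TOP_K of NUM_EXPERTS experts run per token, compute stays roughly constant as the total parameter count grows, which is the efficiency argument behind MoE models.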

No commits in the last 6 months.

Use this if you are an AI researcher building, training, and evaluating Mixture-of-Experts language models and need a robust, transparent framework for your experiments.

Not ideal if you are looking to simply use a pre-trained language model or fine-tune an existing model without delving into MoE architecture research.

Topics: AI Research, Large Language Models, Mixture-of-Experts, Machine Learning Engineering, Natural Language Processing
Flags: No License, Stale 6m, No Package, No Dependents
Score breakdown:
Maintenance: 2 / 25
Adoption: 7 / 25
Maturity: 7 / 25
Community: 16 / 25


Stars: 33
Forks: 7
Language: Jupyter Notebook
License: none
Last pushed: Sep 19, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/cmu-flame/FLAME-MoE"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
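For programmatic access, the curl call above can be reproduced from Python with the standard library alone. This is a minimal sketch: the endpoint path is taken verbatim from the curl example, but the structure of the returned JSON is an assumption, not a documented schema.

```python
# Sketch: fetch the quality report for a repository from the pt-edge API.
# The base URL and path segments come from the curl example above; the
# response is assumed to be JSON, but its field names are not documented here.
import json
import urllib.request

BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"

def build_url(registry: str, repo: str) -> str:
    """Build the quality-report URL for a repository."""
    return f"{BASE_URL}/{registry}/{repo}"

def fetch_quality(registry: str, repo: str) -> dict:
    """Fetch and parse the JSON quality report (network call)."""
    with urllib.request.urlopen(build_url(registry, repo), timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    report = fetch_quality("transformers", "cmu-flame/FLAME-MoE")
    print(json.dumps(report, indent=2))
```

At the free tier this endpoint allows 100 requests per day without a key, so a client polling many repositories should cache responses rather than refetch on every run.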