louisbrulenaudet/mergeKit

Tools for merging pretrained Large Language Models and creating Mixture of Experts (MoE) models from open-source models.

Score: 22 / 100 (Experimental)

MergeKit helps AI practitioners combine multiple pre-trained large language models (LLMs) into a single, more capable model, or create Mixture of Experts (MoE) models. You provide existing open-source LLMs, and it outputs a new, merged model along with a basic README file for sharing. It is aimed at AI engineers and researchers who want to customize and enhance LLMs without vast computational resources.
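To illustrate the core idea behind model merging, the sketch below linearly interpolates the parameters of two checkpoints. This is a generic "weight averaging" technique, not this repository's actual API; the parameter names and plain-float dicts are hypothetical stand-ins for real LLM state dicts.

```python
# Minimal sketch of linear model merging: alpha * A + (1 - alpha) * B.
# Real merging tools operate on full tensor state dicts; plain floats
# keep the example self-contained. All names here are hypothetical.

def merge_linear(weights_a, weights_b, alpha=0.5):
    """Blend two parameter dicts with interpolation factor alpha."""
    if weights_a.keys() != weights_b.keys():
        raise ValueError("Models must share the same parameter names")
    return {
        name: alpha * weights_a[name] + (1 - alpha) * weights_b[name]
        for name in weights_a
    }

# Two toy "models" with matching parameter names.
model_a = {"layer.0.weight": 1.0, "layer.1.weight": 2.0}
model_b = {"layer.0.weight": 3.0, "layer.1.weight": 4.0}

merged = merge_linear(model_a, model_b, alpha=0.5)
print(merged)  # {'layer.0.weight': 2.0, 'layer.1.weight': 3.0}
```

With `alpha=0.5` each parameter is the simple average of the two models; other merge strategies (SLERP, task arithmetic, MoE routing) replace this elementwise blend with more structured combinations.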

No commits in the last 6 months.

Use this if you want to combine several open-source Large Language Models or create a Mixture of Experts model efficiently, even with limited GPU memory.

Not ideal if you are looking for a tool to train LLMs from scratch or fine-tune them on new datasets rather than merging existing ones.

Topics: Large Language Models · AI Model Development · Machine Learning Research · Model Optimization · Natural Language Processing

Flags: Stale (6 months) · No Package · No Dependents
Maintenance 2 / 25
Adoption 4 / 25
Maturity 16 / 25
Community 0 / 25


Stars: 8
Forks:
Language: Jupyter Notebook
License: Apache-2.0
Last pushed: Sep 18, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/louisbrulenaudet/mergeKit"

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.