tommasomncttn/mergenetic
Flexible library for merging large language models (LLMs) via evolutionary optimization (ACL 2025 Demo).
Mergenetic helps you combine multiple existing LLMs into a new, more performant one, even with limited computing power. You provide the source models, and it uses evolutionary optimization to search for the best way to merge their knowledge. The output is an optimized, merged LLM that can perform better on specific tasks, which makes it well suited to AI researchers and machine learning engineers experimenting with model optimization.
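The core idea can be illustrated with a toy sketch, assuming a simple mutation-based evolutionary search over per-model merge weights. This is not mergenetic's actual API; the models are stand-in parameter lists and the fitness function is a placeholder for a real task benchmark:

```python
# Illustrative only: evolve merge weights for two "models" (plain parameter
# lists) so the weighted average scores well on a stand-in fitness function.
import random

def merge(models, weights):
    """Normalized weighted average of parameter vectors."""
    total = sum(weights)
    return [sum(w * m[i] for w, m in zip(weights, models)) / total
            for i in range(len(models[0]))]

def fitness(params):
    """Toy objective; real use would evaluate the merged model on a task."""
    target = [0.2, 0.8, 0.5]  # hypothetical "ideal" parameters
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def evolve(models, generations=50, pop_size=20, seed=0):
    rng = random.Random(seed)
    # Start from random positive weight vectors, one weight per model.
    pop = [[rng.random() for _ in models] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: fitness(merge(models, w)), reverse=True)
        parents = pop[: pop_size // 2]          # keep the fitter half
        children = []
        for _ in range(pop_size - len(parents)):
            parent = rng.choice(parents)
            # Gaussian mutation, clamped so weights stay positive.
            children.append([max(1e-6, w + rng.gauss(0, 0.05)) for w in parent])
        pop = parents + children
    return max(pop, key=lambda w: fitness(merge(models, w)))

model_a = [0.0, 1.0, 0.0]
model_b = [1.0, 0.0, 1.0]
best = evolve([model_a, model_b])
```

The search reliably finds a merge that scores better than either parent model alone, which is the whole point of merging: the optimizer decides how much of each model to keep.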
100 stars. No commits in the last 6 months. Available on PyPI.
Use this if you want to create a specialized large language model by intelligently combining existing models without needing vast computational resources.
Not ideal if you are looking to train a large language model from scratch or fine-tune a single model for a specific task.
Stars: 100
Forks: 5
Language: Jupyter Notebook
License: Apache-2.0
Category:
Last pushed: Aug 08, 2025
Commits (30d): 0
Dependencies: 20
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/tommasomncttn/mergenetic"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
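The same endpoint can be called from Python with only the standard library. A minimal sketch: the URL comes from the curl example above, but the response's JSON fields are not documented here, so the fetch helper simply returns the decoded payload:

```python
# Minimal client sketch for the quality API (endpoint from the curl example;
# the shape of the JSON response is an assumption).
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, repo: str) -> str:
    """Build the quality-endpoint URL, e.g. for 'tommasomncttn/mergenetic'."""
    return f"{BASE}/{ecosystem}/{repo}"

def fetch_quality(ecosystem: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(ecosystem, repo)) as resp:
        return json.load(resp)

url = quality_url("transformers", "tommasomncttn/mergenetic")
```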
Higher-rated alternatives
ModelTC/LightCompress
[EMNLP 2024 & AAAI 2026] A powerful toolkit for compressing large models including LLMs, VLMs,...
p-e-w/heretic
Fully automatic censorship removal for language models
Orion-zhen/abliteration
Make abliterated models with transformers, easy and fast
YerbaPage/LongCodeZip
LongCodeZip: Compress Long Context for Code Language Models [ASE2025]
locuslab/wanda
A simple and effective LLM pruning approach.