declare-lab/della

DELLA-Merging: Reducing Interference in Model Merging through Magnitude-Based Sampling

Score: 23 / 100 · Experimental

This tool helps AI engineers combine several specialized large language models (LLMs) into a single, more versatile model without needing extensive new training. You input your existing, fine-tuned LLMs that excel in specific areas (like math, coding, or general instructions), and it outputs a new, merged LLM capable of handling multiple tasks effectively. This is for machine learning practitioners or researchers who manage and deploy LLMs.

No commits in the last 6 months.

Use this if you need to consolidate multiple domain-specific large language models into one efficient model to reduce deployment costs or improve multi-task performance.

Not ideal if you're looking for a tool to train a large language model from scratch or to fine-tune a model on a completely new dataset.
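To make the idea behind "magnitude-based sampling" concrete, here is a rough NumPy sketch of the approach the paper's title describes: each fine-tuned model's delta from the base weights is stochastically dropped with a keep probability proportional to its magnitude, survivors are rescaled to keep the update unbiased, and the sampled deltas are averaged back onto the base. This is an illustrative sketch only; the function names, hyperparameters, and the exact probability assignment are assumptions, not the repo's implementation.

```python
import numpy as np

def magnitude_sample(delta, keep_frac=0.4, eps=1e-8, seed=None):
    """Keep each delta entry with probability proportional to |delta|
    (scaled so the mean keep probability is keep_frac), then rescale
    survivors by 1/p so the expected update is unchanged.
    Hedged sketch of magnitude-based sampling, not DELLA's exact code."""
    rng = np.random.default_rng(seed)
    mag = np.abs(delta)
    p = np.clip(mag / (mag.mean() + eps) * keep_frac, 0.0, 1.0)
    mask = rng.random(delta.shape) < p
    # Rescale kept entries; dropped entries become exactly zero.
    return np.where(mask, delta / np.maximum(p, eps), 0.0)

def merge(base, finetuned_models, keep_frac=0.4, seed=0):
    """Average the sampled deltas of several fine-tuned models and
    add the result back onto the shared base weights."""
    deltas = [magnitude_sample(ft - base, keep_frac, seed=seed + i)
              for i, ft in enumerate(finetuned_models)]
    return base + np.mean(deltas, axis=0)
```

In practice this would run per weight tensor across all layers of the checkpoints; the sparsification is what reduces interference when the averaged deltas from different specialist models disagree.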

Tags: large-language-models, model-optimization, multi-task-learning, AI-model-deployment, machine-learning-engineering
Badges: No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 8 / 25
Community 8 / 25

How are scores calculated?

Stars: 36
Forks: 3
Language: Python
License: None
Last pushed: Jul 12, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/declare-lab/della"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
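The same endpoint can be called from Python instead of curl. The sketch below builds the URL from the path layout shown in the curl example and fetches the JSON payload with the standard library; the response field names are not documented here, so the code only decodes the JSON rather than assuming a schema.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem, owner, repo):
    """Build the quality endpoint URL, following the path layout
    of the curl example: /quality/<ecosystem>/<owner>/<repo>."""
    return f"{API_BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem, owner, repo, timeout=10):
    """Fetch and decode the JSON score payload. Inspect the raw
    response to learn its fields; the schema is not specified here."""
    url = quality_url(ecosystem, owner, repo)
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)
```

For example, `fetch_quality("transformers", "declare-lab", "della")` retrieves the data shown on this page, subject to the 100 requests/day limit for keyless access.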