one-some/lazy-transformers-merge

Merge transformers without using like a bajillion GB of RAM

Score: 21 / 100 — Experimental

This tool helps you combine several large language models into a single, new model without needing a powerful computer with vast amounts of memory. You input a list of existing models, each with a specific weight indicating its contribution, and it outputs a new, merged model. It's designed for machine learning practitioners and researchers who want to experiment with model merging on standard hardware.

No commits in the last 6 months.

Use this if you need to merge multiple large transformer models and are limited by your computer's RAM capacity.

Not ideal if you need to merge tokenizers alongside your models or require more complex merging algorithms than a weighted average.
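The weighted-average merge described above can be sketched roughly as follows. This is a hypothetical illustration, not the repo's actual code: it uses NumPy arrays as stand-ins for model weights and processes one parameter at a time, which is the key to keeping memory low — in the real tool each lookup would load a single tensor from disk rather than hold every model in RAM.

```python
import numpy as np

def weighted_merge(param_loaders, param_names, weights):
    """Merge models parameter-by-parameter as a weighted average.

    param_loaders: one callable per model; loader(name) returns that
                   parameter's array (in a real implementation this
                   would read one tensor from disk on demand).
    param_names:   shared parameter names (models are assumed to have
                   identical architectures and tensor shapes).
    weights:       per-model contributions; normalized to sum to 1.
    """
    total = float(sum(weights))
    merged = {}
    for name in param_names:
        acc = None
        for loader, w in zip(param_loaders, weights):
            # Only the running sum plus one input tensor are live at once.
            tensor = loader(name) * (w / total)
            acc = tensor if acc is None else acc + tensor
        merged[name] = acc
    return merged
```

For example, merging two single-parameter "models" with equal weights averages their tensors element-wise.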

Tags: large-language-models, model-merging, machine-learning-research, model-experimentation
Stale (6 months) · No package · No dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 0 / 25


Stars: 10
Forks:
Language: Python
License: AGPL-3.0
Last pushed: Aug 05, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/one-some/lazy-transformers-merge"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
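The same endpoint can be called from a script. A minimal sketch using only the standard library — the `quality_url` helper and the assumption that the endpoint returns JSON are mine, not documented by the API:

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem: str, owner: str, repo: str) -> dict:
    """Fetch the quality report; assumes the response body is JSON."""
    with urllib.request.urlopen(quality_url(ecosystem, owner, repo)) as resp:
        return json.load(resp)
```

For instance, `fetch_quality("transformers", "one-some", "lazy-transformers-merge")` hits the same URL as the curl command above.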