thunlp/OpenDelta

A plug-and-play library for parameter-efficient-tuning (Delta Tuning)

Score: 52 / 100 (Established)

When working with large pre-trained language models, OpenDelta helps you adapt them to specific natural language tasks efficiently. Instead of retraining the entire massive model, it trains a small set of added 'delta' parameters (for tasks like spelling correction or text generation) while keeping the original weights frozen. The input is a pre-trained model and your task data; the output is a compact delta checkpoint that customizes the original model for your use case. This is ideal for machine learning engineers and researchers who want to adapt large models without immense computational resources.
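To see why the delta checkpoint is so much smaller, here is a minimal sketch of the low-rank "delta" idea behind one family of parameter-efficient methods (LoRA-style updates, which OpenDelta supports among others). The function names are illustrative, not OpenDelta's API; the sketch only compares trainable-parameter counts.

```python
# Illustrative parameter-count comparison for low-rank delta tuning.
# A full fine-tune of a d_in x d_out weight matrix updates every entry;
# a rank-r delta instead trains two thin matrices A (d_in x r) and
# B (r x d_out) whose product A @ B is added to the frozen weight.

def full_params(d_in: int, d_out: int) -> int:
    """Trainable parameters when fine-tuning the full weight matrix."""
    return d_in * d_out

def lora_delta_params(d_in: int, d_out: int, rank: int) -> int:
    """Trainable parameters for a rank-`rank` delta A @ B."""
    return rank * (d_in + d_out)

# One 1024 x 1024 attention weight: full fine-tuning vs a rank-8 delta.
full = full_params(1024, 1024)            # 1,048,576 parameters
delta = lora_delta_params(1024, 1024, 8)  # 16,384 parameters
print(f"full: {full}, delta: {delta}, ratio: {full // delta}x")  # 64x fewer
```

Summed over every adapted layer, this gap is why the saved delta is a small file rather than a second copy of the model.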

1,040 stars. No commits in the last 6 months. Available on PyPI.

Use this if you need to adapt large pre-trained language models for new tasks or datasets and want to save significant computational resources and storage space.

Not ideal if you are developing a pre-trained model from scratch or performing general-purpose training without a specific adaptation target.

natural-language-processing model-adaptation machine-learning-engineering deep-learning-research text-generation
Stale 6m
Maintenance 0 / 25
Adoption 10 / 25
Maturity 25 / 25
Community 17 / 25


Stars

1,040

Forks

84

Language

Python

License

Apache-2.0

Last pushed

Sep 19, 2024

Commits (30d)

0

Dependencies

12

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/thunlp/OpenDelta"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.