thunlp/OpenDelta
A plug-and-play library for parameter-efficient-tuning (Delta Tuning)
OpenDelta helps you adapt large pre-trained language models to specific natural language tasks efficiently. Instead of retraining the entire massive model, you tune only a small set of added or selected parameters ("delta tuning") for your target task, such as spelling correction or text generation. The input is a pre-trained model and your task; the output is a much smaller "delta" module that customizes the original for your use case. This is ideal for machine learning engineers and researchers who want to customize large models without immense computational resources.
1,040 stars. No commits in the last 6 months. Available on PyPI.
Use this if you need to adapt large pre-trained language models for new tasks or datasets and want to save significant computational resources and storage space.
Not ideal if you are developing a pre-trained model from scratch or performing general-purpose training without a specific adaptation target.
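To see why delta tuning saves so much storage and compute, here is a minimal pure-Python sketch of the arithmetic behind one common delta method (a LoRA-style low-rank update). This is illustrative only and does not use OpenDelta's API; the matrix size and rank are hypothetical numbers chosen for the example.

```python
# Compare trainable parameter counts for one weight matrix:
# full fine-tuning vs. a LoRA-style low-rank delta W + B @ A.

def full_finetune_params(d_in: int, d_out: int) -> int:
    # Full fine-tuning updates the entire d_out x d_in weight matrix.
    return d_in * d_out

def lora_delta_params(d_in: int, d_out: int, rank: int) -> int:
    # A rank-r delta trains only B (d_out x r) and A (r x d_in).
    return rank * (d_in + d_out)

# Hypothetical 4096x4096 projection with a rank-8 delta:
full = full_finetune_params(4096, 4096)   # 16,777,216 parameters
delta = lora_delta_params(4096, 4096, 8)  # 65,536 parameters
print(f"delta trains {delta / full:.2%} of the full parameter count")
# -> delta trains 0.39% of the full parameter count
```

Because only the small delta is trained and stored, one frozen backbone can be shared across many tasks, each adding just its own compact delta checkpoint.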
Stars
1,040
Forks
84
Language
Python
License
Apache-2.0
Category
Last pushed
Sep 19, 2024
Commits (30d)
0
Dependencies
12
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/thunlp/OpenDelta"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Related frameworks
optuna/optuna
A hyperparameter optimization framework
keras-team/keras-tuner
A Hyperparameter Tuning Library for Keras
KernelTuner/kernel_tuner
Kernel Tuner
syne-tune/syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning