adapter-hub/efficient-task-transfer
Research code for "What to Pre-Train on? Efficient Intermediate Task Selection", EMNLP 2021
This tool helps machine learning engineers and researchers efficiently train and evaluate Transformer and adapter models on a range of Natural Language Understanding (NLU) tasks. It streamlines experimentation with different pre-trained models and fine-tuning strategies: you provide task configurations and the desired models, and the tool handles data preprocessing, training, evaluation, and optional notifications.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher regularly fine-tuning Transformer or Adapter models for NLU tasks and want to standardize and accelerate your experimentation workflow.
Not ideal if you want a no-code solution, or pre-built, production-ready NLU models that don't require you to manage the training process yourself.
Stars: 37
Forks: 4
Language: Python
License: MIT
Last pushed: Dec 21, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/adapter-hub/efficient-task-transfer"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
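The curl call above can also be made from Python. A minimal sketch, assuming only that the endpoint returns JSON: the URL path is taken verbatim from the curl example, but the response schema is not documented here, so `fetch_quality` returns the decoded payload without assuming any particular fields.

```python
import json
from urllib.request import urlopen

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(owner: str, repo: str, category: str = "transformers") -> str:
    """Build the quality-endpoint URL for a GitHub repo.

    The "transformers" category segment mirrors the curl example;
    other categories are an assumption.
    """
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (makes a network call)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Same request as the curl example, printed as the target URL.
    print(quality_url("adapter-hub", "efficient-task-transfer"))
```

Swapping in an HTTP client like `requests` works the same way; `urllib` is used here only to keep the sketch dependency-free.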
Higher-rated alternatives
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
gaussalgo/adaptor
ACL 2022: Adaptor: a library to easily adapt a language model to your own task, domain, or...
ylsung/VL_adapter
PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for Vision-and-Language...
intersun/LightningDOT
source code and pre-trained/fine-tuned checkpoint for NAACL 2021 paper LightningDOT
kyegomez/M2PT
Implementation of M2PT in PyTorch from the paper: "Multimodal Pathway: Improve Transformers with...