adapter-hub/efficient-task-transfer

Research code for "What to Pre-Train on? Efficient Intermediate Task Selection", EMNLP 2021

Score: 33 / 100 (Emerging)

This tool helps machine learning engineers and researchers efficiently train and evaluate Transformer and Adapter models for various Natural Language Understanding (NLU) tasks. It streamlines the process of experimenting with different pre-trained models and fine-tuning strategies. You provide task configurations and desired models, and the tool handles data preprocessing, training, evaluation, and optional notifications.
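
Purely to illustrate that workflow, here is a hypothetical task configuration sketch in Python. Every key and value below is an assumption for illustration; this listing does not document the repo's actual config schema or entry points.

# Hypothetical configuration for one experiment run; keys are
# illustrative assumptions, not the repo's documented schema.
task_config = {
    "model_name_or_path": "bert-base-uncased",  # pre-trained model to start from
    "task_name": "rte",                         # target NLU task
    "intermediate_task": "mnli",                # intermediate task to transfer from
    "training_method": "adapter",               # adapter training vs. full fine-tuning
    "num_train_epochs": 10,
    "learning_rate": 1e-4,
}
print(task_config)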

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher regularly fine-tuning Transformer or Adapter models for NLU tasks and want to standardize and accelerate your experimentation workflow.

Not ideal if you are looking for a no-code solution or pre-built, production-ready NLU models without needing to manage the training process yourself.

natural-language-processing machine-learning-engineering model-fine-tuning deep-learning-research text-analytics
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 10 / 25


Stars: 37
Forks: 4
Language: Python
License: MIT
Last pushed: Dec 21, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/adapter-hub/efficient-task-transfer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
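
For programmatic use, here is a minimal Python sketch of the same request. It assumes the endpoint returns JSON; the response schema is not documented in this listing.

import requests

# Same endpoint as the curl example above; up to 100 requests/day
# need no API key.
URL = (
    "https://pt-edge.onrender.com/api/v1/quality/"
    "transformers/adapter-hub/efficient-task-transfer"
)

resp = requests.get(URL, timeout=10)
resp.raise_for_status()  # surface HTTP errors instead of parsing a bad body

# Assumption: the body is JSON; the exact fields are not documented here.
print(resp.json())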