alexandra-chron/hierarchical-domain-adaptation
Code for the NAACL 2022 paper "Efficient Hierarchical Domain Adaptation for Pretrained Language Models".
This tool helps machine learning engineers and researchers fine-tune large language models such as GPT-2 when working with data from many specialized domains. You provide domain-specific text datasets, and the tool produces an efficiently adapted language model that performs better across those domains, including new, unseen ones. It is aimed at users who manage and deploy large language models in specialized applications.
No commits in the last 6 months.
Use this if you need to adapt a large language model to perform well across multiple distinct text domains (e.g., legal, medical, financial) and want to do so more efficiently than traditional methods.
Not ideal if you are working with a single text domain or do not have a strong need for fine-tuning large language models across diverse, hierarchical datasets.
Stars
32
Forks
3
Language
Python
License
—
Category
nlp
Last pushed
Sep 26, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/alexandra-chron/hierarchical-domain-adaptation"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
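The same request can be made from Python. This is a minimal sketch: the helper names (`quality_url`, `fetch_quality`) are hypothetical, and the assumption that the endpoint returns JSON is not confirmed by the listing above.

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, repo: str) -> str:
    """Build the API URL for a repository's quality record."""
    return f"{BASE}/{category}/{repo}"


def fetch_quality(category: str, repo: str) -> dict:
    """Fetch the record; assumes (unverified) that the endpoint returns JSON."""
    with urllib.request.urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)


url = quality_url("nlp", "alexandra-chron/hierarchical-domain-adaptation")
print(url)
```

Without a key this counts against the 100-requests/day anonymous limit, so cache responses rather than polling.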
Higher-rated alternatives
airaria/TextBrewer
A PyTorch-based knowledge distillation toolkit for natural language processing
sunyilgdx/NSP-BERT
The code for our paper "NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original...
princeton-nlp/CoFiPruning
[ACL 2022] Structured Pruning Learns Compact and Accurate Models https://arxiv.org/abs/2204.00408
kssteven418/LTP
[KDD'22] Learned Token Pruning for Transformers
georgian-io/Transformers-Domain-Adaptation
:no_entry: [DEPRECATED] Adapt Transformer-based language models to new text domains