itsShnik/adaptively-finetuning-transformers
Adaptively fine-tuning transformer-based models for multiple domains and tasks
No commits in the last 6 months.
Stars
6
Forks
2
Language
Python
License
—
Category
Last pushed
May 22, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/itsShnik/adaptively-finetuning-transformers"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
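For scripted access, the curl command above can be wrapped in a few lines of Python. This is a minimal sketch using only the standard library; the URL pattern is taken from this page, but the helper names and the assumption that the endpoint returns JSON are illustrative, not part of any official client.

```python
import json
import urllib.request

# Base path copied from the curl example on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record as a dict (requires network access;
    assumes the endpoint responds with JSON)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


print(quality_url("itsShnik", "adaptively-finetuning-transformers"))
```

With an API key, rate limits rise to 1,000 requests/day; how the key is passed (header vs. query parameter) is not documented here, so check the API docs before adding authentication.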
Higher-rated alternatives
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
gaussalgo/adaptor
ACL 2022: Adaptor: a library to easily adapt a language model to your own task, domain, or...
ylsung/VL_adapter
PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for Vision-and-Language...
intersun/LightningDOT
source code and pre-trained/fine-tuned checkpoint for NAACL 2021 paper LightningDOT
kyegomez/M2PT
Implementation of M2PT in PyTorch from the paper: "Multimodal Pathway: Improve Transformers with...