adapters and adaptor
These are competing libraries with overlapping approaches to parameter-efficient adaptation of language models. adapter-hub/adapters is the more mature and widely adopted unified framework, while adaptor focuses on task- and domain-specific fine-tuning built around custom training objectives.
About adapters
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
This library helps machine learning engineers and researchers fine-tune large language models (LLMs) more efficiently. It takes a pre-trained Transformer model and task-specific datasets, then lets you add and train small, specialized 'adapter' modules while the base model's weights stay frozen. The result is a model adapted to specific NLP tasks like text classification or question answering, with only the lightweight adapter weights trained and stored, instead of retraining the entire large model.
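To make the efficiency claim concrete, here is a back-of-the-envelope calculation for a Houlsby-style bottleneck adapter (a down-projection and an up-projection inserted into each layer). The dimensions below are illustrative BERT-base-like numbers, not taken from any particular checkpoint or from the library itself:

```python
# Back-of-the-envelope comparison: bottleneck adapters vs. full fine-tuning.
# All sizes are illustrative (roughly BERT-base), not measured values.

def adapter_params(hidden: int, bottleneck: int) -> int:
    """Parameters in one bottleneck adapter: a down-projection
    (hidden x bottleneck weights + bottleneck biases) followed by an
    up-projection (bottleneck x hidden weights + hidden biases)."""
    down = hidden * bottleneck + bottleneck
    up = bottleneck * hidden + hidden
    return down + up

HIDDEN = 768              # hidden size of a BERT-base-like model
LAYERS = 12               # number of transformer layers
BOTTLENECK = 64           # adapter bottleneck dimension
FULL_MODEL = 110_000_000  # ~110M parameters, roughly BERT-base

# Two adapters per layer (after attention and after the feed-forward
# block), as in the original Houlsby et al. configuration.
trainable = 2 * LAYERS * adapter_params(HIDDEN, BOTTLENECK)
print(f"trainable adapter params: {trainable:,}")      # 2,379,264
print(f"fraction of full model:   {trainable / FULL_MODEL:.2%}")
```

Training roughly 2% of the parameters, while leaving the other 98% frozen, is what makes it cheap to keep one adapter per task on top of a single shared base model.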
About adaptor
gaussalgo/adaptor
ACL 2022: Adaptor: a library to easily adapt a language model to your own task, domain, or custom objective(s).
This tool helps language model users improve their models' performance on specific kinds of text or tasks. It takes an existing language model and your specialized text data, then trains the model, through one or more configurable training objectives, to better handle your domain (e.g., medical jargon, legal documents) or task (e.g., sentiment analysis, entity recognition). The result is a more accurate and robust language model tailored to your needs, aimed at data scientists and NLP practitioners.
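The core idea behind combining a domain objective (e.g., masked-language modelling on in-domain text) with a task objective (e.g., classification) is to schedule batches from each objective during one training run. The sketch below shows that scheduling idea only; the class and function names are illustrative and are not adaptor's actual API:

```python
# Conceptual sketch of multi-objective adaptation scheduling: alternate
# batches from a domain objective and a task objective in one training
# loop. Names here are illustrative, NOT adaptor's real API.

from itertools import cycle

class Objective:
    """One training signal: a name plus an iterable of batches."""
    def __init__(self, name, batches):
        self.name = name
        self.batches = batches

def parallel_schedule(objectives, steps):
    """Round-robin over objectives: each step draws one batch from the
    next objective in turn, so the model alternates between, say,
    in-domain MLM batches and labelled task batches."""
    streams = [(obj.name, cycle(obj.batches)) for obj in objectives]
    order = cycle(streams)
    for _ in range(steps):
        name, stream = next(order)
        yield name, next(stream)

# Hypothetical toy data standing in for real domain text and task labels.
mlm = Objective("domain_mlm", ["medical text 1", "medical text 2"])
cls = Objective("task_cls", [("report A", 1), ("report B", 0)])

steps = list(parallel_schedule([mlm, cls], steps=4))
for name, batch in steps:
    print(name, batch)
```

A real training loop would replace the `print` with a forward/backward pass using each objective's own loss; the scheduling structure is the part this sketch is meant to show.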