adapters and awesome-adapter-resources

The first is a production-ready PyTorch library implementing multiple parameter-efficient adapter architectures (LoRA, prefix tuning, and others); the second is a curated reference collection documenting adapter methods and research. They are complementary: practitioners use the library, while researchers consult the collection.

adapters (score 72, Verified)
  Scores: Maintenance 13/25 | Adoption 12/25 | Maturity 25/25 | Community 22/25
  Stars: 2,802
  Forks: 375
  Downloads:
  Commits (30d): 1
  Language: Python
  License: Apache-2.0
  No risk flags

awesome-adapter-resources
  Scores: Maintenance 0/25 | Adoption 10/25 | Maturity 16/25 | Community 9/25
  Stars: 202
  Forks: 9
  Downloads:
  Commits (30d): 0
  Language: Python
  License: ISC
  Risk flags: Stale 6m, No Package, No Dependents

About adapters

adapter-hub/adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

This library helps machine learning engineers and researchers fine-tune large language models (LLMs) more efficiently. It takes a pre-trained Transformer model and task-specific datasets, then lets you add and train small, specialized 'adapter' modules. The result is a model specialized for NLP tasks like text classification or question answering, without retraining the entire large model.

natural-language-processing machine-learning-engineering deep-learning-research model-optimization transfer-learning
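The core idea behind the library, as described above, is that only small adapter modules are trained while the pre-trained weights stay frozen. As a library-agnostic illustration (plain NumPy, not the `adapters` API; the dimensions and initialization are illustrative assumptions), a residual bottleneck adapter inserts two small projection matrices and adds their output back to the hidden state:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a typical Transformer hidden size vs. a small bottleneck.
d_model, d_bottleneck = 768, 64

# Only these two small matrices would be trained; the pre-trained layer
# weights (not shown) stay frozen.
W_down = rng.normal(scale=0.02, size=(d_model, d_bottleneck))
W_up = np.zeros((d_bottleneck, d_model))  # zero-init: adapter starts as identity

def bottleneck_adapter(h):
    """Residual bottleneck adapter: h + up_proj(ReLU(down_proj(h)))."""
    z = np.maximum(h @ W_down, 0.0)
    return h + z @ W_up

h = rng.normal(size=(4, d_model))   # a batch of token representations
out = bottleneck_adapter(h)

# Trainable-parameter comparison against one full d_model x d_model layer.
adapter_params = W_down.size + W_up.size   # 2 * 768 * 64 = 98,304
full_layer_params = d_model * d_model      # 768 * 768   = 589,824
print(adapter_params, full_layer_params)
```

Zero-initializing the up-projection means the adapter is initially a no-op, so training starts from the pre-trained model's behavior and only gradually departs from it; even this single adapter uses roughly a sixth of the parameters of one dense layer, which is where the efficiency claim comes from.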

About awesome-adapter-resources

calpt/awesome-adapter-resources

Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning

This resource helps machine learning practitioners efficiently adapt large, pre-trained neural networks like Transformers for new tasks. It provides a curated collection of tools and research papers on 'adapter' methods, which allow you to customize these large models without the extensive computational cost of full fine-tuning. This is ideal for AI engineers, data scientists, and ML researchers who work with large language models, computer vision models, or audio processing models.

large-language-models natural-language-processing computer-vision audio-processing model-optimization

Scores updated daily from GitHub, PyPI, and npm data.