adapters and Multimodal-Adapters
Scores
adapters: Maintenance 13/25, Adoption 12/25, Maturity 25/25, Community 22/25
Multimodal-Adapters: Maintenance 0/25, Adoption 4/25, Maturity 8/25, Community 13/25
Repository stats
adapters: Stars 2,802, Forks 375, Downloads —, Commits (30d) 1, Language Python, License Apache-2.0
Multimodal-Adapters: Stars 7, Forks 2, Downloads —, Commits (30d) 0, Language Jupyter Notebook, License —
Risk flags
adapters: No risk flags
Multimodal-Adapters: No License, Stale 6m, No Package, No Dependents
About adapters
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
This library helps machine learning engineers and researchers fine-tune large language models (LLMs) more efficiently. It takes a pre-trained Transformer model and a task-specific dataset, and lets you add and train small, specialized 'adapter' modules on top. The result is a model tuned for a specific NLP task such as text classification or question answering, without retraining the entire large model.
natural-language-processing
machine-learning-engineering
deep-learning-research
model-optimization
transfer-learning
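The bottleneck design behind these adapter modules is simple enough to sketch. The snippet below is a minimal NumPy illustration of the idea, not the library's actual implementation: a small down-projection, a nonlinearity, an up-projection, and a residual connection around the frozen model's hidden state. All names and dimensions here are illustrative.

```python
import numpy as np

def bottleneck_adapter(h, W_down, W_up):
    """Bottleneck adapter: down-project, ReLU, up-project, residual add.

    The frozen base model produces hidden states h; only the tiny
    W_down/W_up matrices would be trained for the new task.
    """
    z = np.maximum(h @ W_down, 0.0)   # down-projection + nonlinearity
    return h + z @ W_up               # up-projection + residual connection

rng = np.random.default_rng(0)
d_model, d_bottleneck = 768, 16       # adapter adds only 2 * 768 * 16 parameters
W_down = rng.standard_normal((d_model, d_bottleneck)) * 0.02
W_up = np.zeros((d_bottleneck, d_model))  # zero-init: adapter starts as identity

h = rng.standard_normal((4, d_model))     # a batch of hidden states
out = bottleneck_adapter(h, W_down, W_up)
assert np.allclose(out, h)  # with a zero-init up-projection, output == input
```

With the library itself, the rough workflow is to load a supported Transformer, register a named adapter, and train only its weights; consult the adapters documentation for the exact API, as the calls above are not taken from it.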
About Multimodal-Adapters
IsaacRodgz/Multimodal-Adapters
Adapter modules with support for multimodal fusion of information (text, video, audio, etc.) using a pre-trained BERT base model
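As a hedged sketch of what multimodal fusion through an adapter can look like (the repository's actual design may differ), the following NumPy snippet projects an auxiliary audio feature into the text hidden space inside a bottleneck adapter. All dimensions and names are assumptions for illustration only.

```python
import numpy as np

def fusion_adapter(h_text, h_audio, W_text, W_audio, W_up):
    """Bottleneck adapter that fuses an audio feature into a text hidden state.

    Both modalities are down-projected into a shared bottleneck, combined,
    then up-projected and added back to the text stream via a residual.
    """
    z = np.maximum(h_text @ W_text + h_audio @ W_audio, 0.0)  # joint bottleneck + ReLU
    return h_text + z @ W_up                                  # residual back to text

rng = np.random.default_rng(0)
d_text, d_audio, d_bn = 768, 128, 16          # illustrative sizes
W_text = rng.standard_normal((d_text, d_bn)) * 0.02
W_audio = rng.standard_normal((d_audio, d_bn)) * 0.02
W_up = np.zeros((d_bn, d_text))               # zero-init: adapter starts as identity

h_text = rng.standard_normal((4, d_text))     # BERT hidden states for 4 tokens
h_audio = rng.standard_normal((4, d_audio))   # time-aligned audio features
out = fusion_adapter(h_text, h_audio, W_text, W_audio, W_up)
assert out.shape == (4, d_text)
```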
Scores updated daily from GitHub, PyPI, and npm data.