calpt/awesome-adapter-resources

Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning

Score: 35 / 100 (Emerging)

This resource helps machine learning practitioners efficiently adapt large, pre-trained neural networks like Transformers for new tasks. It provides a curated collection of tools and research papers on 'adapter' methods, which allow you to customize these large models without the extensive computational cost of full fine-tuning. This is ideal for AI engineers, data scientists, and ML researchers who work with large language models, computer vision models, or audio processing models.
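To make the adapter idea concrete, here is a minimal sketch of a classic bottleneck adapter layer using plain NumPy. The sizes and initialization are illustrative assumptions (not taken from the linked resources): the backbone's hidden states pass through a small trainable down-projection, a nonlinearity, and an up-projection, with a residual connection, while the large pre-trained model itself stays frozen.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_bottleneck = 768, 64   # illustrative sizes, not from the repo

# Only these two small matrices are trained; the backbone stays frozen.
W_down = rng.normal(scale=0.02, size=(d_model, d_bottleneck))
W_up = np.zeros((d_bottleneck, d_model))  # zero-init: adapter starts as identity

def bottleneck_adapter(hidden: np.ndarray) -> np.ndarray:
    """Down-project, apply a nonlinearity, up-project, add the residual."""
    z = np.maximum(hidden @ W_down, 0.0)  # ReLU in the low-rank bottleneck
    return hidden + z @ W_up

hidden = rng.normal(size=(4, d_model))    # a batch of hidden states
out = bottleneck_adapter(hidden)
print(out.shape)  # (4, 768)
```

With roughly 2 × 768 × 64 trainable parameters per layer instead of the full model's weights, this is why adapters cut fine-tuning and storage costs so sharply.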

202 stars. No commits in the last 6 months.

Use this if you need to fine-tune large pre-trained models for various tasks with significantly reduced computational resources, storage, and deployment complexity compared to full model fine-tuning.

Not ideal if you are working with small models that don't benefit from parameter-efficient techniques or if you require full fine-tuning for maximum performance and have ample computational resources.

large-language-models natural-language-processing computer-vision audio-processing model-optimization
Badges: Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 9 / 25

How are scores calculated?

Stars: 202
Forks: 9
Language: Python
License: ISC
Last pushed: May 04, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/calpt/awesome-adapter-resources"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
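The same endpoint can be called from Python's standard library. This is a sketch only: the URL pattern comes from the curl example above, but the header name used to pass an API key is an assumption and should be checked against the API's documentation.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{BASE}/{ecosystem}/{repo}"

url = quality_url("transformers", "calpt/awesome-adapter-resources")
req = urllib.request.Request(url)
# With a free API key (1,000 requests/day), attach it as a header.
# NOTE: "X-API-Key" is an assumed header name; confirm with the API docs.
# req.add_header("X-API-Key", "YOUR_KEY")

# Uncomment to fetch (100 requests/day without a key):
# with urllib.request.urlopen(req) as resp:
#     data = json.loads(resp.read())
#     print(data)
print(url)
```

Keeping the network call commented out makes the snippet safe to run offline while still showing how the request is assembled.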