Hzfinfdu/MPMP
ACL'2023: Multi-Task Pre-Training of Modular Prompt for Few-Shot Learning
This repository helps AI/ML practitioners adapt pre-trained language models to new text classification tasks with very little task-specific training data. Given a small labeled dataset for a new classification problem, it fine-tunes a model and outputs a text classifier ready for deployment. It is aimed at data scientists and machine learning engineers working in natural language processing.
No commits in the last 6 months.
Use this if you need to build text classification models efficiently for various tasks, especially when you only have a small amount of labeled data for each new task.
Not ideal if you are working with large, well-resourced datasets for a single task where traditional fine-tuning methods are sufficient.
Stars: 40
Forks: 7
Language: Python
License: —
Category:
Last pushed: Oct 24, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/Hzfinfdu/MPMP"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
google-research/prompt-tuning
Original implementation of Prompt Tuning from Lester et al., 2021
thunlp/PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
ZhangYuanhan-AI/NOAH
[TPAMI] Searching prompt modules for parameter-efficient transfer learning.
gmkim-ai/PromptKD
An official implementation of "PromptKD: Distilling Student-Friendly Knowledge for Generative...
zhengzangw/DoPrompt
Official implementation of PCS from the paper "Prompt Vision Transformer for Domain Generalization"