Prompt Engineering Tools: Knowledge Distillation Frameworks

Eight knowledge distillation framework tools are tracked. The highest-rated is google-research/prompt-tuning, scoring 43/100 with 697 stars.

Get all 8 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=prompt-engineering&subcategory=knowledge-distillation-frameworks&limit=20"
```

The API is open to everyone at 100 requests/day with no key; a free key raises the limit to 1,000 requests/day.
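The same query can be issued from Python. This is a minimal sketch using only the standard library; the endpoint and query parameters come from the curl example above, but the shape of the JSON response is not documented here, so `fetch_tools` simply returns the parsed payload as-is.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE_URL = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Assemble the dataset query URL from the documented parameters."""
    params = urlencode({"domain": domain, "subcategory": subcategory, "limit": limit})
    return f"{BASE_URL}?{params}"

def fetch_tools(domain: str, subcategory: str, limit: int = 20):
    """Fetch and parse the JSON payload (response schema is not documented here)."""
    with urlopen(build_url(domain, subcategory, limit)) as resp:
        return json.load(resp)

# Reproduces the curl example's URL exactly:
url = build_url("prompt-engineering", "knowledge-distillation-frameworks")
```

Calling `fetch_tools("prompt-engineering", "knowledge-distillation-frameworks")` performs the same request as the curl command, subject to the same 100 requests/day unauthenticated limit.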

| # | Tool | Description | Score | Tier |
|---|------|-------------|-------|------|
| 1 | google-research/prompt-tuning | Original implementation of Prompt Tuning from Lester et al., 2021 | 43 | Emerging |
| 2 | thunlp/PromptPapers | Must-read papers on prompt-based tuning for pre-trained language models | 38 | Emerging |
| 3 | ZhangYuanhan-AI/NOAH | [TPAMI] Searching prompt modules for parameter-efficient transfer learning | 37 | Emerging |
| 4 | gmkim-ai/PromptKD | An official implementation of "PromptKD: Distilling Student-Friendly..." | 35 | Emerging |
| 5 | zhengzangw/DoPrompt | Official implementation of PCS in essay "Prompt Vision Transformer for..." | 33 | Emerging |
| 6 | Hzfinfdu/MPMP | ACL'2023: Multi-Task Pre-Training of Modular Prompt for Few-Shot Learning | 30 | Emerging |
| 7 | youngjae-cho/APP | Official PyTorch implementation for Make Prompts Adaptable: Bayesian... | 27 | Experimental |
| 8 | machengcheng2016/Subspace-Prompt-Learning | Official code for "Understanding and Mitigating Overfitting in Prompt Tuning..." | 15 | Experimental |