ucinlp/autoprompt
AutoPrompt: Automatic Prompt Construction for Masked Language Models.
This project helps NLP practitioners automatically construct effective prompts for masked language models to perform specific tasks. It takes raw text data (such as customer reviews or factual statements) and searches for prompt templates that elicit accurate predictions from the model. This is useful for data scientists and NLP researchers who want to quickly gauge a model's inherent ability on tasks like sentiment analysis, natural language inference, or fact extraction without extensive fine-tuning.
641 stars. No commits in the last 6 months.
Use this if you need to rapidly discover how well a masked language model can perform various NLP tasks by generating optimal prompts for zero-shot or few-shot learning.
Not ideal if you are looking for a tool to fine-tune language models with labeled data or build custom models from scratch.
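To make the idea concrete, here is a minimal sketch of an AutoPrompt-style template, assuming the paper's convention of trigger tokens ([T]) placed between the input and a mask token that the model fills in. The `build_prompt` helper is illustrative and not part of the repo's API; the real tool searches for the trigger tokens with a gradient-guided method rather than leaving them as placeholders.

```python
def build_prompt(text: str, num_triggers: int = 3, mask_token: str = "[MASK]") -> str:
    """Append placeholder trigger tokens and a mask token to the input.

    AutoPrompt replaces the [T] placeholders with concrete vocabulary
    tokens found by gradient-guided search; the masked LM's prediction
    at the mask position then serves as the task output.
    """
    triggers = " ".join("[T]" for _ in range(num_triggers))
    return f"{text} {triggers} {mask_token}"

print(build_prompt("The food was delicious."))
# -> The food was delicious. [T] [T] [T] [MASK]
```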
Stars
641
Forks
86
Language
Python
License
Apache-2.0
Category
Prompt engineering
Last pushed
Aug 24, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/ucinlp/autoprompt"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
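The same endpoint can be called from Python with only the standard library. This is a hedged sketch: the URL pattern comes from the curl example above, but the shape of the JSON response is not documented here, so the code just returns the parsed payload as a dict.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def api_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given repo."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and parse the JSON payload (requires network access)."""
    with urllib.request.urlopen(api_url(category, owner, repo)) as resp:
        return json.load(resp)

print(api_url("prompt-engineering", "ucinlp", "autoprompt"))
# Network call, run only when online:
# data = fetch_quality("prompt-engineering", "ucinlp", "autoprompt")
```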
Related tools
THUDM/P-tuning-v2
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
zjunlp/KnowPrompt
[WWW 2022] KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation...
zjunlp/PromptKG
PromptKG Family: a Gallery of Prompt Learning & KG-related research works, toolkits, and paper-list.
princeton-nlp/OptiPrompt
[NAACL 2021] Factual Probing Is [MASK]: Learning vs. Learning to Recall https://arxiv.org/abs/2104.05240
VE-FORBRYDERNE/mtj-softtuner
Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab...