zjunlp/KnowPrompt
[WWW 2022] KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction
This project helps researchers and data scientists automatically extract relationships between entities from text. You feed it a collection of documents (like news articles, scientific papers, or dialogues) and it identifies connections like "person works for company" or "drug treats disease," even with limited training examples. It's designed for natural language processing practitioners working on information extraction tasks.
207 stars. No commits in the last 6 months.
Use this if you need to identify specific relationships between entities in large text datasets and want to improve accuracy, especially when you have only a small amount of labeled data for training.
Not ideal if your primary goal is general text classification or summarization, rather than the specific task of extracting defined relationships between entities.
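KnowPrompt frames relation extraction as masked-language-model prompting: the input sentence is wrapped in a template with entity markers and a [MASK] slot that the model fills with a relation word. A minimal, dependency-free sketch of that template construction (the marker tokens and template wording here are illustrative assumptions, not KnowPrompt's actual vocabulary):

```python
# Illustrative prompt construction for prompt-based relation extraction.
# Marker tokens ([E1], [E2]) and the cloze template are assumptions for
# this sketch, not KnowPrompt's exact implementation.

def build_prompt(sentence: str, subj: str, obj: str) -> str:
    """Wrap the sentence in a cloze-style template whose [MASK] slot
    a masked language model would fill with a relation label word."""
    marked = sentence.replace(subj, f"[E1] {subj} [/E1]")
    marked = marked.replace(obj, f"[E2] {obj} [/E2]")
    return f"{marked} In this sentence, {subj} [MASK] {obj}."

prompt = build_prompt(
    "Steve Jobs co-founded Apple in 1976.", "Steve Jobs", "Apple")
print(prompt)
```

A model would then score candidate relation words (e.g. "founded", "works for") at the [MASK] position; KnowPrompt's contribution is learning knowledge-injected virtual words for those slots rather than using fixed label words.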
Stars: 207
Forks: 35
Language: Python
License: MIT
Category: prompt-engineering
Last pushed: Jun 20, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/zjunlp/KnowPrompt"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
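The same endpoint can be called from Python using only the standard library; a sketch (the response schema is not documented here, so the payload is decoded as generic JSON):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering"

def api_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (100 requests/day without a key)."""
    with urllib.request.urlopen(api_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

print(api_url("zjunlp", "KnowPrompt"))
```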
Higher-rated alternatives
THUDM/P-tuning-v2
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
ucinlp/autoprompt
AutoPrompt: Automatic Prompt Construction for Masked Language Models.
zjunlp/PromptKG
PromptKG Family: a Gallery of Prompt Learning & KG-related research works, toolkits, and paper-list.
princeton-nlp/OptiPrompt
[NAACL 2021] Factual Probing Is [MASK]: Learning vs. Learning to Recall https://arxiv.org/abs/2104.05240
VE-FORBRYDERNE/mtj-softtuner
Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab...