zjunlp/PromptKG
PromptKG Family: a gallery of prompt learning & KG-related research works, toolkits, and paper lists.
This project is a curated collection of research and tools for anyone working with prompt learning and knowledge graphs, especially those built on pre-trained language models (PLMs). It provides implementations of research models, along with libraries for creating and editing PLM-based knowledge graph embeddings. Researchers and practitioners in natural language processing who want to enhance language models with structured knowledge will find it useful.
734 stars. No commits in the last 6 months.
Use this if you are a researcher or advanced practitioner looking for state-of-the-art methods and tools to integrate knowledge graphs with prompt-based language models for tasks like text classification, relation extraction, or question answering.
Not ideal if you are new to prompt engineering or knowledge graphs and are looking for a simple, off-the-shelf solution without diving into research papers and model implementations.
Stars: 734
Forks: 77
Language: Python
License: MIT
Category:
Last pushed: Mar 22, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/zjunlp/PromptKG"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
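The same endpoint can be called from Python. A minimal sketch, assuming only the URL pattern shown in the curl example above; the response schema and the `Authorization` header name for keyed access are assumptions, so check the API documentation before relying on them:

```python
import json
from typing import Optional
from urllib.request import Request, urlopen

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str, api_key: Optional[str] = None) -> dict:
    """Fetch quality data as JSON. Without a key you are limited to
    100 requests/day; the header name below is an assumption."""
    req = Request(quality_url(owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urlopen(req, timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Equivalent to the curl command above.
    print(fetch_quality("zjunlp", "PromptKG"))
```

The URL builder is kept separate from the fetch so the endpoint can be composed for other repositories without issuing a request.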
Higher-rated alternatives
THUDM/P-tuning-v2
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
ucinlp/autoprompt
AutoPrompt: Automatic Prompt Construction for Masked Language Models.
zjunlp/KnowPrompt
[WWW 2022] KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation...
princeton-nlp/OptiPrompt
[NAACL 2021] Factual Probing Is [MASK]: Learning vs. Learning to Recall https://arxiv.org/abs/2104.05240
VE-FORBRYDERNE/mtj-softtuner
Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab...