zjunlp/knowledge-rumination

[EMNLP 2023] Knowledge Rumination for Pre-trained Language Models

Score: 27 / 100 (Experimental)

This project helps natural language processing researchers improve how pre-trained language models answer knowledge-intensive questions. You provide a pre-trained language model and a question-answering dataset, and the project produces a model trained to first "ruminate" on, that is, explicitly recall, its own latent knowledge before answering, and then apply that recalled knowledge to the question. This is ideal for NLP scientists or machine learning engineers working on knowledge-intensive question-answering systems.

No commits in the last 6 months.

Use this if you are developing question-answering or reasoning systems and need to improve your language model's ability to surface and apply the knowledge it already holds.

Not ideal if you are looking for a simple, off-the-shelf question-answering solution, or if you are not comfortable working with command-line tools for model training.

conversational-ai natural-language-processing question-answering machine-learning-research model-fine-tuning
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 5 / 25

How are scores calculated?

Stars: 17
Forks: 1
Language: Python
License: MIT
Last pushed: Jun 29, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/zjunlp/knowledge-rumination"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
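The curl call above can also be made from Python. A minimal sketch follows; the endpoint path is taken from the listing, but the shape of the JSON response (field names such as scores) is an assumption and should be checked against a real response.

```python
import json
import urllib.request

# Endpoint base taken from the curl example in the listing.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-report URL for a given GitHub owner/repo."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality report as JSON.

    The response is assumed to be a JSON object; inspect it before
    relying on any particular field names.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


# Build (but do not fetch) the URL for this repository.
print(quality_url("zjunlp", "knowledge-rumination"))
```

Unauthenticated use is limited to 100 requests/day, so a script polling many repositories should add its own throttling or supply an API key.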