kamyarghajar/DistilledNeuralResponseRanker

Implementation of "Distilling Knowledge for Fast Retrieval-based Chat-bots" (SIGIR 2020) using deep matching transformer networks and knowledge distillation for response retrieval in information-seeking conversational systems.

Score: 26/100 (Experimental)

This project helps developers create faster, more efficient information-seeking chatbots. It takes a dataset of conversations and uses it to train a system that can quickly identify the best response to a user's query. The primary users are developers building conversational AI systems who want to improve response retrieval speed.
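The speed-up described above comes from knowledge distillation: a large teacher ranker supervises a smaller, faster student. As a minimal sketch (not the repository's actual code; function names and temperature value are illustrative), the standard distillation objective compares temperature-softened teacher and student score distributions:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw scores."""
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) over softened distributions.

    Scaled by T^2, following the common distillation formulation
    (Hinton et al.); the repo may use a different variant.
    """
    p = softmax(teacher_logits, temperature)  # teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q)
    )
```

The loss is zero when the student reproduces the teacher's response-score distribution exactly and grows as the two diverge, so minimizing it transfers the teacher's ranking behavior to the cheaper student model.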

No commits in the last 6 months.

Use this if you are a developer building a retrieval-based chatbot and need to significantly reduce the time your bot takes to find and return relevant answers.

Not ideal if you are looking for a pre-built chatbot solution or a tool for generative AI chatbots.

chatbot-development conversational-ai response-retrieval natural-language-processing information-retrieval
No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 8 / 25
Community: 13 / 25


Stars: 9
Forks: 2
Language: Python
License: none
Last pushed: Jul 23, 2020
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kamyarghajar/DistilledNeuralResponseRanker"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
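The same endpoint can be called from Python. This is a minimal sketch, assuming the response is JSON; the field names below (`score`, `maintenance`, `adoption`) are assumptions for illustration and should be checked against a real response:

```python
import json
from urllib.request import urlopen

API_URL = (
    "https://pt-edge.onrender.com/api/v1/quality/"
    "transformers/kamyarghajar/DistilledNeuralResponseRanker"
)

def parse_quality(payload: str) -> dict:
    """Pull a few fields out of the API's JSON payload.

    The keys queried here are hypothetical; inspect the actual
    response before relying on them.
    """
    data = json.loads(payload)
    return {key: data.get(key) for key in ("score", "maintenance", "adoption")}

# Live call (counts against the 100-requests/day anonymous quota):
# with urlopen(API_URL) as resp:
#     print(parse_quality(resp.read().decode()))
```

Keeping the network call commented out makes the parsing logic easy to test offline against a sample payload.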