AristotelisPap/Question-Answering-with-BERT-and-Knowledge-Distillation
Fine-tuned BERT on the SQuAD 2.0 dataset. Applied Knowledge Distillation (KD) and fine-tuned DistilBERT (student) using BERT as the teacher model. Reduced the size of the original BERT by 40%.
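The repository does not spell out the distillation objective in this description, but the standard Hinton-style KD loss it refers to combines a temperature-softened KL term against the teacher with an ordinary cross-entropy term against the gold label. A minimal sketch, assuming the usual temperature `T` and mixing weight `alpha` (all names here are illustrative, not taken from the repo):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, label,
                      temperature=2.0, alpha=0.5):
    """Hinton-style KD loss (illustrative sketch, not the repo's exact code):
    alpha * T^2 * KL(teacher || student) + (1 - alpha) * CE(student, label).
    The T^2 factor keeps soft-target gradients on the same scale as the
    hard-label loss when the temperature changes.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL divergence between the softened teacher and student distributions.
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student) if pt > 0)
    # Standard cross-entropy against the hard label (temperature 1).
    hard = softmax(student_logits, 1.0)
    ce = -math.log(hard[label])
    return alpha * temperature ** 2 * kl + (1 - alpha) * ce
```

In practice this is computed over batches of logits with a framework such as PyTorch; the scalar version above only shows the shape of the objective.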
This project helps developers create efficient question-answering systems. It takes text documents and questions as input, then identifies precise answers within the text or determines if no answer exists. This is ideal for machine learning engineers or NLP specialists building solutions that need to quickly understand and respond to user queries based on provided context.
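Since the model is trained on SQuAD 2.0, answer extraction has to handle unanswerable questions: the model scores candidate spans and compares the best one against a "no answer" score. A model-free sketch of that selection step (the function name, the convention that index 0 holds the null/[CLS] score, and the threshold are assumptions for illustration):

```python
def extract_answer(tokens, start_logits, end_logits,
                   null_threshold=0.0, max_answer_len=30):
    """Pick the best answer span from start/end logits, SQuAD 2.0 style.

    A span (i, j) scores start_logits[i] + end_logits[j]; the "no answer"
    score is taken from position 0 (the [CLS] slot, by common convention).
    If no span beats the null score by more than `null_threshold`,
    return None to signal that the question is unanswerable.
    """
    null_score = start_logits[0] + end_logits[0]
    best_score, best_span = float("-inf"), None
    for i in range(1, len(tokens)):
        for j in range(i, min(i + max_answer_len, len(tokens))):
            score = start_logits[i] + end_logits[j]
            if score > best_score:
                best_score, best_span = score, (i, j)
    if best_span is None or best_score <= null_score + null_threshold:
        return None
    i, j = best_span
    return " ".join(tokens[i:j + 1])
```

Real implementations work on subword tokens and map the span back to character offsets in the original document, but the thresholding logic is the same.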
No commits in the last 6 months.
Use this if you are a machine learning engineer building a question-answering system and need to optimize your model for faster inference and smaller size without significant performance loss.
Not ideal if you are looking for a ready-to-use, off-the-shelf question-answering application rather than a framework for building one.
Stars: 26
Forks: 4
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Feb 13, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/AristotelisPap/Question-Answering-with-BERT-and-Knowledge-Distillation"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
cdqa-suite/cdQA
⛔ [NOT MAINTAINED] An End-To-End Closed Domain Question Answering System.
AMontgomerie/question_generator
An NLP system for generating reading comprehension questions
KristiyanVachev/Leaf-Question-Generation
Easy to use and understand multiple-choice question generation algorithm using T5 Transformers.
robinniesert/kaggle-google-quest
Google QUEST Q&A Labeling Kaggle Competition 6th Place Solution
cooelf/AwesomeMRC
IJCAI 2021 Tutorial & code for Retrospective Reader for Machine Reading Comprehension (AAAI 2021)