AristotelisPap/Question-Answering-with-BERT-and-Knowledge-Distillation

Fine-tuned BERT on the SQuAD 2.0 dataset. Applied Knowledge Distillation (KD) to fine-tune DistilBERT (student) using BERT as the teacher model, reducing the size of the original BERT by 40%.
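As a concrete illustration of the approach described above, the following is a minimal PyTorch sketch of soft-target knowledge distillation for extractive QA. The checkpoints, temperature, and loss weighting are illustrative assumptions, not values taken from this repository.

# Minimal sketch of knowledge distillation for extractive QA.
# Checkpoints, temperature T, and mixing weight alpha are assumptions,
# not values from this repository.
import torch
import torch.nn.functional as F
from transformers import AutoModelForQuestionAnswering

teacher = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")
student = AutoModelForQuestionAnswering.from_pretrained("distilbert-base-uncased")
teacher.eval()

T, alpha = 2.0, 0.5  # assumed temperature and soft/hard loss mix

def distillation_step(batch):
    # batch: input_ids, attention_mask, start_positions, end_positions
    inputs = {k: batch[k] for k in ("input_ids", "attention_mask")}
    with torch.no_grad():
        t_out = teacher(**inputs)
    s_out = student(**batch)  # gold spans present, so s_out.loss is set
    # Soft loss: match student start/end logits to the teacher's,
    # softened by temperature T (standard Hinton-style KD).
    soft = sum(
        F.kl_div(F.log_softmax(s / T, dim=-1),
                 F.softmax(t / T, dim=-1),
                 reduction="batchmean") * T * T
        for s, t in [(s_out.start_logits, t_out.start_logits),
                     (s_out.end_logits, t_out.end_logits)])
    # Hard loss: cross-entropy against the gold answer spans.
    return alpha * soft + (1 - alpha) * s_out.loss

The combined loss is backpropagated through the student only; the teacher stays frozen throughout training.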

Score: 36 / 100 (Emerging)

This project helps developers build efficient question-answering systems. Given a text document and a question, it identifies the precise answer span within the text or flags that no answer exists (the unanswerable-question setting of SQuAD 2.0). It is aimed at machine learning engineers and NLP specialists building solutions that must quickly understand and respond to user queries from provided context.
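For a sense of what inference looks like, here is a short sketch using the Hugging Face question-answering pipeline. The checkpoint is a public stand-in for a distilled QA model, not this project's weights.

# Illustrative inference with a distilled extractive-QA model.
# The checkpoint is a public stand-in, not this project's weights.
from transformers import pipeline

qa = pipeline("question-answering",
              model="distilbert-base-uncased-distilled-squad")

context = ("BERT was fine-tuned on SQuAD 2.0 and then distilled into "
           "DistilBERT, cutting model size by roughly 40%.")
result = qa(question="What was BERT distilled into?",
            context=context,
            handle_impossible_answer=True)  # allow an empty (no-answer) span
print(result["answer"], result["score"])

With handle_impossible_answer=True, an empty answer string signals that the model judged the question unanswerable from the given context.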

No commits in the last 6 months.

Use this if you are a machine learning engineer building a question-answering system and need to optimize your model for faster inference and smaller size without significant performance loss.

Not ideal if you are looking for a ready-to-use, off-the-shelf question-answering application rather than a framework for building one.

natural-language-processing machine-learning-engineering information-retrieval text-analytics AI-development
Status: Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 26
Forks: 4
Language: Jupyter Notebook
License: MIT
Last pushed: Feb 13, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/AristotelisPap/Question-Answering-with-BERT-and-Knowledge-Distillation"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
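If you prefer Python over curl, a small requests sketch follows; it assumes the endpoint returns JSON, since the response schema is not documented on this page.

# Fetch the same quality report in Python; assumes a JSON response.
import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/transformers/"
       "AristotelisPap/Question-Answering-with-BERT-and-Knowledge-Distillation")
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())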