atulkum/co-attention
Pytorch implementation of "Dynamic Coattention Networks For Question Answering"
This project shows how a machine learning model answers questions about a given text: you provide a document and a question, and it visualizes which parts of the document the model attended to when producing its answer. It is aimed at researchers and practitioners working on natural language understanding and question-answering systems.
No commits in the last 6 months.
Use this if you need to visualize and analyze the attention mechanisms of a co-attention model applied to question answering.
Not ideal if you need a ready-to-use question-answering application for end users rather than a tool for model analysis.
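The core idea behind coattention is an affinity matrix between document and question token embeddings, normalized into attention distributions. The repo's actual PyTorch code differs; this is a minimal pure-Python sketch of that step, with made-up toy embeddings, just to illustrate what "which parts of the document the model focused on" means:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def coattention_weights(doc_vecs, q_vecs):
    """Affinity L[i][j] = <d_i, q_j>, then row-softmax: for each
    document token, a distribution over question tokens."""
    L = [[sum(d * q for d, q in zip(dv, qv)) for qv in q_vecs]
         for dv in doc_vecs]
    return [softmax(row) for row in L]

# Hypothetical 2-d embeddings for three document tokens and two
# question tokens (illustration only, not from the repo).
doc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
qry = [[1.0, 0.0], [0.0, 1.0]]

A = coattention_weights(doc, qry)
for i, row in enumerate(A):
    print(f"doc token {i}:", ["%.2f" % w for w in row])
```

Each row of `A` sums to 1, so it can be read directly as "how strongly this document token aligns with each question token" — the quantity the visualization in this project is built around.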
Stars
62
Forks
14
Language
Python
License
—
Category
Last pushed
Oct 21, 2018
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/atulkum/co-attention"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
localminimum/QANet
A Tensorflow implementation of QANet for machine reading comprehension
carpedm20/MemN2N-tensorflow
"End-To-End Memory Networks" in Tensorflow
HKUST-KnowComp/R-Net
Tensorflow Implementation of R-Net
domluna/memn2n
End-To-End Memory Network using Tensorflow
allenai/bi-att-flow
Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that...