simhag/Compositional-Pre-Training-for-Semantic-Parsing-with-BERT
Implementation of Semantic Parsing with BERT and compositional pre-training on GeoQuery
This project converts natural language questions or commands into a structured, machine-readable format called a 'logical form.' You input a plain-English sentence, and it outputs a precise representation that computers can act upon. It's for anyone building systems that need to interpret user language, such as virtual assistants or smart search engines.
No commits in the last 6 months.
Use this if you need to translate user queries from natural language into a structured format for automated processing.
Not ideal if you're looking for a general-purpose natural language understanding tool that doesn't focus specifically on converting to logical forms.
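To make the input/output relationship concrete, here is a toy sketch in Python. The question/logical-form pairs follow the standard GeoQuery FunQL style the project targets; the lookup table is purely illustrative (the real project generates logical forms with a BERT-based model, not a dictionary), and the exact predicate names shown are assumptions based on common GeoQuery conventions.

```python
# Illustrative only: a toy stand-in showing the kind of mapping a GeoQuery
# semantic parser produces. The real system predicts these logical forms
# with a trained model rather than a lookup table.
EXAMPLES = {
    "what states border texas":
        "answer(state(next_to_2(stateid('texas'))))",
    "what is the capital of ohio":
        "answer(capital_1(stateid('ohio')))",
}

def parse(question: str) -> str:
    """Return the logical form for a known question (toy demo)."""
    return EXAMPLES[question.lower().strip(" ?")]

print(parse("What states border Texas?"))
```

Given the first question, this prints the FunQL query `answer(state(next_to_2(stateid('texas'))))`, which a downstream executor could evaluate against the GeoQuery database.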
Stars: 11
Forks: 5
Language: Scala
License: MIT
Category:
Last pushed: Mar 20, 2019
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/simhag/Compositional-Pre-Training-for-Semantic-Parsing-with-BERT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
abachaa/MedQuAD
Medical Question Answering Dataset of 47,457 QA pairs created from 12 NIH websites
medmcqa/medmcqa
A large-scale (194k), Multiple-Choice Question Answering (MCQA) dataset designed to address...
shawnh2/QA-CivilAviationKG
An automatic question-answering system based on a civil-aviation knowledge graph
YeYzheng/KGQA-Based-On-medicine
An intelligent question-answering system based on a medical knowledge graph
chizhu/KGQA_HLM
A knowledge-graph-based system for visualizing character relationships and answering questions about Dream of the Red Chamber