Abstract: Knowledge base question answering (KBQA) aims to answer user questions posed in natural language using the rich human knowledge stored in large KBs. Since current KBQA methods struggle with unseen knowledge base elements and novel compositions of such elements at test time, we introduce \textbf{SG-KBQA} --- a novel model that injects schema contexts into entity retrieval and logical form generation to tackle this issue. It exploits information about the semantics and structure of the knowledge base provided by schema contexts to enhance generalizability. We show that SG-KBQA achieves strong generalizability, outperforming state-of-the-art models on two commonly used benchmark datasets across a variety of test settings. Our source code is available at \url{https://anonymous.4open.science/r/SG-KBQA-7895}.
Paper Type: Long
Research Area: Question Answering
Research Area Keywords: Knowledge Base Question Answering
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 7489