Beyond Seen Data: Improving KBQA Generalization Through Schema-Guided Logical Form Generation

ACL ARR 2025 February Submission5265 Authors

16 Feb 2025 (modified: 09 May 2025) · ACL ARR 2025 February Submission · CC BY 4.0
Abstract:

Knowledge base question answering (KBQA) aims to answer natural language questions using the rich human knowledge stored in large KBs. Because current KBQA methods struggle with knowledge base elements unseen at test time, we introduce $\textbf{SG-KBQA}$, a novel model that injects schema contexts into entity retrieval and logical form generation to address this issue. SG-KBQA exploits the richer semantics and the awareness of knowledge base structure that schema contexts provide to enhance generalizability. We show that SG-KBQA achieves strong generalizability, outperforming state-of-the-art models on two commonly used benchmark datasets across a variety of test settings. Our source code is available at https://anonymous.4open.science/r/SG-KBQA-7895.

Paper Type: Long
Research Area: Question Answering
Research Area Keywords: Knowledge Base Question Answering
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 5265