Semantic Parsing with Candidate Expressions for Knowledge Base Question Answering

20 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: neurosymbolic & hybrid AI systems (physics-informed, logic & formal reasoning, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: semantic parsing, constrained decoding, sequence-to-sequence, natural language processing
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: Constrained decoding for sequence-to-sequence semantic parsing
Abstract: Semantic parsers convert natural language to logical forms, which can then be evaluated on knowledge bases (KBs) to produce denotations. Early neural semantic parsers used grammars that define actions, such as production rules, so that a parser could construct well-typed logical forms by sequentially taking actions. In contrast, recent neural semantic parsers build on pre-trained sequence-to-sequence (seq2seq) models, such as BART and T5, which treat logical forms as plain sequences of tokens. However, seq2seq models struggle to learn to generate logical forms that contain components drawn from large KBs. In this work, we propose a grammar augmented with candidate expressions for seq2seq semantic parsing over large KBs. The grammar defines actions as production rules, and our semantic parser predicts actions during inference under constraints imposed by types and candidate expressions. We apply the grammar to knowledge base question answering, where the constraints from candidate expressions help the semantic parser generate valid KB components. Experiments on the KQA Pro benchmark show that the constraints from candidate expressions increase the accuracy of our semantic parser, which achieves state-of-the-art performance on KQA Pro.
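The core idea of constrained decoding described in the abstract can be illustrated with a minimal sketch: at each decoding step, the parser restricts its choice to a candidate set of tokens (e.g., those that continue a valid KB entity or relation name). The function names and the toy data below are hypothetical simplifications, not the paper's actual grammar or parser.

```python
def constrained_argmax(logits, allowed_ids):
    """Pick the highest-scoring token among those the constraints allow."""
    best_id, best_score = None, float("-inf")
    for tok_id in allowed_ids:
        if logits[tok_id] > best_score:
            best_id, best_score = tok_id, logits[tok_id]
    return best_id

def decode(step_logits, allowed_per_step):
    """Greedy decoding where each step is restricted to a candidate set,
    e.g. tokens that extend a valid KB component under the grammar."""
    return [constrained_argmax(logits, allowed)
            for logits, allowed in zip(step_logits, allowed_per_step)]

# Toy example: a 4-token vocabulary with per-step candidate sets.
step_logits = [[0.1, 0.9, 0.3, 0.2],
               [0.8, 0.1, 0.7, 0.4]]
allowed_per_step = [{0, 2}, {2, 3}]
print(decode(step_logits, allowed_per_step))  # -> [2, 2]
```

Note that at step 0 the unconstrained argmax would pick token 1, but the candidate set rules it out; this is how constraints steer a seq2seq model toward valid KB components.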
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2430