Context Transformer with Stacked Pointer Networks for Conversational Question Answering over Knowledge Graphs

Published: 23 Feb 2021, Last Modified: 17 Nov 2024 · ESWC 2021 Research
Keywords: Conversational Question Answering, Knowledge Graph, Context Transformer, Stacked Pointer Networks
Abstract: Neural semantic parsing approaches have been widely used for Question Answering (QA) systems over knowledge graphs. Such methods provide the flexibility to handle QA datasets with complex queries and a large number of entities. In this work, we propose a novel framework named CARTON, which performs multi-task semantic parsing for conversational question answering over a large-scale knowledge graph. Our framework consists of a stack of pointer networks as an extension of a context transformer model for parsing the input question and the dialog history. The framework generates a sequence of actions that can be executed on the knowledge graph. We evaluate CARTON on a standard dataset for complex sequential question answering, on which CARTON outperforms all baselines. Specifically, we observe performance improvements in F1-score on eight out of ten question types compared to the previous state of the art. For logical reasoning questions, CARTON achieves an improvement of 11 absolute F1 points.
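The abstract's central mechanism is a pointer network: instead of emitting a word from a fixed vocabulary, the decoder attends over the positions of the input (question plus dialog history) and "points" at one of them. The paper's actual model is not reproduced here; the following is only a minimal plain-Python sketch of pointer-style attention under assumed toy dimensions, with all names (`pointer_attention`, `query`, `encoder_states`) being illustrative rather than taken from CARTON:

```python
import math

def pointer_attention(query, encoder_states):
    """Score each input position against a decoder query and return
    a probability distribution over positions (the "pointer").

    query: list of floats (a decoder hidden state)
    encoder_states: list of float lists (one vector per input token)
    """
    # Dot-product score between the query and each encoder state.
    scores = [sum(q * h for q, h in zip(query, state))
              for state in encoder_states]
    # Numerically stable softmax over input positions: the pointer
    # selects a token from the input rather than a vocabulary word.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy example: three 2-d encoder states; the pointer should favor
# the state most aligned with the query vector.
query = [1.0, 0.0]
states = [[0.1, 0.9], [2.0, 0.0], [0.5, 0.5]]
probs = pointer_attention(query, states)
best = max(range(len(probs)), key=lambda i: probs[i])
```

In a stacked arrangement like the one the abstract describes, several such pointer heads would each resolve one argument type (e.g. an entity, type, or predicate slot) of the generated action sequence.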
Subtrack: NLP and Information Retrieval
First Author Is Student: Yes