Context Transformer with Stacked Pointer Networks for Conversational Question Answering over Knowledge Graphs

Dec 11, 2020 (edited Mar 16, 2021) · ESWC 2021 Research
  • Keywords: Conversational Question Answering, Knowledge Graph, Context Transformer, Stacked Pointer Networks
  • Abstract: Neural semantic parsing approaches have been widely used for Question Answering (QA) systems over knowledge graphs. Such methods provide the flexibility to handle QA datasets with complex queries and a large number of entities. In this work, we propose a novel framework named CARTON, which performs multi-task semantic parsing for handling the problem of conversational question answering over a large-scale knowledge graph. Our framework consists of a stack of pointer networks as an extension of a context transformer model for parsing the input question and the dialog history. The framework generates a sequence of actions that can be executed on the knowledge graph. We evaluate CARTON on a standard dataset for complex sequential question answering on which CARTON outperforms all baselines. Specifically, we observe performance improvements in F1-score on eight out of ten question types compared to the previous state of the art. For logical reasoning questions, an improvement of 11 absolute points is reached.
  • First Author Is Student: Yes
  • Subtrack: NLP and Information Retrieval
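The abstract describes stacked pointer networks that select items (e.g. entities, types, predicates) directly from the input when generating executable actions. As an illustrative sketch only (not the authors' implementation; all names and values here are hypothetical), a single pointer step can be viewed as dot-product attention over encoder positions, yielding a distribution over the input sequence:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def pointer_attention(decoder_state, encoder_outputs):
    """Score each input position by its dot product with the decoder
    state, then normalize: the result is a 'pointer' distribution over
    input positions rather than over a fixed output vocabulary."""
    scores = [sum(d * e for d, e in zip(decoder_state, enc))
              for enc in encoder_outputs]
    return softmax(scores)

# Toy example: 3 encoder positions with hidden size 2 (illustrative values).
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 1.0]
probs = pointer_attention(dec, enc)
best = max(range(len(probs)), key=probs.__getitem__)  # input position the pointer selects
```

A stack of such pointers, as in CARTON, would run one of these selection steps per category (entity, type, predicate), each conditioned on the transformer's contextual representations.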