Grounding Dialogue Systems via Knowledge Graph Aware Decoding with Pre-trained Transformers

Dec 09, 2020 (edited Mar 16, 2021) · ESWC 2021 Research · Readers: Everyone
  • Keywords: dialogue systems, knowledge integration, natural language processing, generative dialogue systems
  • Abstract: Generating knowledge grounded responses in both goal and non-goal oriented dialogue systems is an important research challenge. Knowledge Graphs (KG) can be viewed as an abstraction of the real world, which can potentially facilitate a dialogue system in producing knowledge grounded responses. However, integrating KGs into the dialogue generation process in an end-to-end manner is a non-trivial task. This paper proposes a novel architecture for integrating KGs into the response generation process by training a BERT model that learns to answer using the elements of the KG (entities and relations) in a multi-task, end-to-end setting. The k-hop subgraph of the KG is incorporated into the model during training and inference using the Graph Laplacian. Empirical evaluation suggests that the model achieves better knowledge groundedness (measured via Entity F1 score) than other state-of-the-art models for both goal and non-goal oriented dialogues.
  • First Author Is Student: Yes
  • Subtrack: NLP and Information Retrieval
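The abstract mentions incorporating the k-hop subgraph of the KG via the Graph Laplacian. As a rough illustration of that idea (not the paper's actual implementation; all function and variable names here are hypothetical), the following sketch extracts a k-hop subgraph from an adjacency matrix and computes its normalized Graph Laplacian:

```python
import numpy as np

def k_hop_subgraph(adj, seed_nodes, k):
    """Return sorted indices of nodes reachable from seed_nodes within k hops."""
    reached = set(seed_nodes)
    frontier = set(seed_nodes)
    for _ in range(k):
        nxt = set()
        for u in frontier:
            nxt.update(np.nonzero(adj[u])[0].tolist())
        frontier = nxt - reached
        reached |= frontier
    return sorted(reached)

def graph_laplacian(adj, normalized=True):
    """Compute L = D - A, or the normalized form I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    if normalized:
        with np.errstate(divide="ignore"):
            d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
        return np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    return np.diag(deg) - adj

# Toy KG adjacency matrix: 5 entities on a path (edges treated as undirected)
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

# Entities within 2 hops of entity 0, and the Laplacian of that subgraph
nodes = k_hop_subgraph(A, seed_nodes=[0], k=2)
sub_adj = A[np.ix_(nodes, nodes)]
L = graph_laplacian(sub_adj)
```

In practice the Laplacian of such a subgraph could act as a structural prior over candidate entities and relations during decoding; the paper's exact integration into the BERT-based model is described in the full text.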