Graph and Question Interaction Aware Graph2Seq Model for Knowledge Base Question Generation

Published: 01 Jan 2022, Last Modified: 19 Jun 2023. IJCNN 2022.
Abstract: Knowledge Base Question Generation (KBQG) is an essential natural language processing task. Taking a knowledge graph and answer entities as input, KBQG aims to generate the corresponding natural language question. Recently, Graph2Seq models have been proposed to encode the knowledge graph and have achieved remarkable results, but one important challenge remains: the graph encoding lacks interaction with the target question. To address this challenge, we propose a graph and question interaction enhanced Graph2Seq model, in which we design an encoder-decoder parallel enhancement mechanism and apply knowledge distillation to both the intermediate representation and the prediction distribution, injecting knowledge of the target question into the graph representation. Experiments conducted on a KBQG benchmark dataset show the promising potential of the proposed method.
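The abstract describes distilling knowledge at two levels: matching intermediate representations and matching prediction distributions. A common way to combine these is an MSE term on hidden states plus a temperature-softened KL term on output distributions. The sketch below illustrates that generic combination in NumPy; the function names, temperature, and weighting are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits,
                      student_repr, teacher_repr,
                      temperature=2.0, alpha=0.5):
    """Generic two-level distillation loss (illustrative, not the paper's exact form).

    - KL(teacher || student) on temperature-softened prediction distributions
    - MSE between intermediate representations
    """
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    eps = 1e-12
    kl = np.sum(p_t * (np.log(p_t + eps) - np.log(p_s + eps)), axis=-1).mean()
    mse = np.mean((student_repr - teacher_repr) ** 2)
    # Scale KL by T^2, as is conventional in distillation, and mix the two terms.
    return alpha * kl * temperature ** 2 + (1 - alpha) * mse
```

When student and teacher agree exactly, both terms vanish and the loss is zero; any mismatch in either the distributions or the representations increases it.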