Incorporating loose-structured knowledge into conversation modeling via recall-gate LSTM

IJCNN 2017
Abstract: Conversation comprehension is critical for automatic chat-bots, as it is essential for providing context-aware responses and conducting smooth dialogues with human beings. As the basis of this task, conversation modeling benefits notably from background knowledge, since such knowledge carries semantic hints that help clarify the relationships between sentences within a conversation. In this paper, a deep neural network is proposed to incorporate background knowledge into conversation modeling. Through a recall mechanism with a specially designed recall-gate, background knowledge, serving as global memory, is brought to cooperate with the local cell memory of Long Short-Term Memory (LSTM), enriching the ability of the LSTM to capture implicit semantic clues in conversations. In addition, this paper introduces loose-structured domain knowledge as the background knowledge, which can be built with a small amount of manual work and easily adopted by the recall-gate. Our model is evaluated on the context-oriented response selection task, and experimental results on two datasets show that our approach is promising for modeling conversations and building key components of automatic chat systems.
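To make the recall-gate idea concrete, below is a minimal PyTorch sketch of an LSTM cell augmented with an extra gate over a fixed background-knowledge embedding. This is an illustration of the general mechanism the abstract describes, not the paper's exact equations: the gate wiring, the layer names (`r_gate`, `k_proj`), and the choice to inject the projected knowledge into the cell state are all assumptions.

```python
import torch
import torch.nn as nn

class RecallGateLSTMCell(nn.Module):
    """Hypothetical LSTM cell with a 'recall' gate over global knowledge.

    Alongside the usual input/forget/output gates, a recall gate r decides
    how much of a background-knowledge embedding k (the global memory)
    flows into the local cell state at each step. The exact formulation in
    the paper may differ; this is only a sketch of the mechanism.
    """

    def __init__(self, input_size: int, hidden_size: int, knowledge_size: int):
        super().__init__()
        gate_in = input_size + hidden_size
        self.i_gate = nn.Linear(gate_in, hidden_size)   # input gate
        self.f_gate = nn.Linear(gate_in, hidden_size)   # forget gate
        self.o_gate = nn.Linear(gate_in, hidden_size)   # output gate
        self.g_cand = nn.Linear(gate_in, hidden_size)   # candidate cell update
        # Recall gate conditioned on the input, the previous hidden state,
        # and the background-knowledge embedding.
        self.r_gate = nn.Linear(gate_in + knowledge_size, hidden_size)
        # Project knowledge into the cell-state space so it can be mixed in.
        self.k_proj = nn.Linear(knowledge_size, hidden_size)

    def forward(self, x, h_prev, c_prev, k):
        z = torch.cat([x, h_prev], dim=-1)
        i = torch.sigmoid(self.i_gate(z))
        f = torch.sigmoid(self.f_gate(z))
        o = torch.sigmoid(self.o_gate(z))
        g = torch.tanh(self.g_cand(z))
        # Recall gate: how much global knowledge to inject at this step.
        r = torch.sigmoid(self.r_gate(torch.cat([z, k], dim=-1)))
        # Local cell memory update, enriched with recall-gated global memory.
        c = f * c_prev + i * g + r * torch.tanh(self.k_proj(k))
        h = o * torch.tanh(c)
        return h, c


# Usage: one step over a batch of token embeddings with a per-conversation
# knowledge vector (all sizes below are illustrative).
cell = RecallGateLSTMCell(input_size=128, hidden_size=256, knowledge_size=64)
x = torch.randn(8, 128)            # batch of token embeddings
h0 = torch.zeros(8, 256)
c0 = torch.zeros(8, 256)
k = torch.randn(8, 64)             # background-knowledge embedding
h1, c1 = cell(x, h0, c0, k)
```

Folding the gated knowledge term into the cell-state update (rather than the hidden state) keeps the injected global memory subject to the forget gate on later steps, which matches the abstract's framing of knowledge cooperating with the local cell memory.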