Guided Sequence-to-Sequence Learning with External Rule Memory
Jiatao Gu, Baotian Hu, Zhengdong Lu, Hang Li, Victor O.K. Li
Feb 18, 2016 (modified: Feb 18, 2016) · ICLR 2016 workshop submission · readers: everyone
Abstract: External memory has proven essential to the success of neural network-based systems on many tasks, including question answering, classification, machine translation, and reasoning. In all of those models the memory stores instance representations at multiple levels, analogous to the "data" in the von Neumann architecture of a computer, while the "instructions" are stored in the network weights. In this paper, we instead propose to use an external memory attached to a neural system to store part of the instructions, specifically the transformation rules in sequence-to-sequence learning tasks. This memory can be accessed both by the neural network and by human experts, and hence serves as an interface for a novel learning paradigm in which not only instances but also rules can be taught to the neural network. Our empirical study on a synthetic but challenging dataset verifies that our model is effective.
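The abstract does not specify how the network addresses the rule memory; a common mechanism for reading an external memory is soft attention. The sketch below is a minimal, hypothetical illustration of that idea, assuming the rules are stored as embedding vectors and addressed by a decoder state (the function names, shapes, and the dot-product scoring are our assumptions, not the paper's method):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def read_rule_memory(query, rule_memory):
    """Soft attention read over an external rule memory (illustrative only).

    query:       (d,) decoder state used to address the memory
    rule_memory: (n_rules, d) one embedding per stored transformation rule
    Returns a convex combination of rule embeddings.
    """
    scores = rule_memory @ query      # (n_rules,) dot-product relevance
    weights = softmax(scores)         # attention distribution over rules
    return weights @ rule_memory      # (d,) read vector

# Hypothetical usage: 4 rules with 8-dimensional embeddings.
rng = np.random.default_rng(0)
memory = rng.standard_normal((4, 8))
state = rng.standard_normal(8)
read = read_rule_memory(state, memory)
print(read.shape)  # (8,)
```

Because the memory holds explicit rule slots rather than opaque weights, a human expert could in principle edit or add a row of `rule_memory` directly, which is the interface property the abstract emphasizes.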