A Memory-augmented Neural Network by Resembling Human Cognitive Process of Memorization

25 Sep 2019 (modified: 24 Dec 2019) · ICLR 2020 Conference Withdrawn Submission
  • Abstract: Memorization of long-term information is a core task in sequence learning for neural networks. Inspired by the human cognitive process of memorization, we propose a sparse memory-augmented neural network (SMANN) to address it, composed of a two-layer neural controller and an external memory. In the first layer of the network, the input is divided into segments according to a sparse mask, and these segments preserve immediate memory; in the second layer, the segmented information is collected and processed as short-term memory. To alleviate the vanishing-gradient problem, constrained LSTM structures are used in both layers, making the chrono initializer more reasonable. Finally, the external memory stores long-term information, and its access rate is sharply reduced thanks to the sparse mask. In experiments, we evaluate the network and its components, such as the constrained LSTM and the neural controller, independently; results on different tasks demonstrate the advantages of our networks over their counterparts.
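The abstract's key efficiency claim is that a sparse mask lets the controller touch the external memory only at segment boundaries, so the access rate falls with the mask's sparsity. The paper's actual architecture is not reproduced here, but that gating idea can be sketched as follows; all names and values (`T`, `sparsity`, the Bernoulli mask) are illustrative assumptions, not details from the paper:

```python
import numpy as np

# Illustrative sketch: a sparse binary mask gates access to external memory.
# Parameter names and values are assumptions for this example only.

rng = np.random.default_rng(0)

T = 100          # sequence length
sparsity = 0.1   # assumed fraction of timesteps that mark a segment boundary

# Layer 1: the sparse mask marks segment boundaries; between boundaries the
# first layer holds "immediate memory" and the external memory is untouched.
mask = (rng.random(T) < sparsity).astype(int)

# Layer 2 / external memory: only masked timesteps trigger a memory access,
# so the access count drops roughly in proportion to the sparsity.
accesses = int(mask.sum())
dense_accesses = T   # a dense controller would access memory at every step

print(f"memory accesses: {accesses}/{dense_accesses}")
```

With a 10% mask, the external memory is accessed about one-tenth as often as in a dense memory-augmented network, which is the mechanism behind the "access rate is sharply reduced" claim.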