Learning Operations on a Stack with Neural Turing Machines

Tristan Deleu

Oct 14, 2016 (modified: Oct 14, 2016) · NIPS 2016 workshop NAMPI submission
  • Abstract: Multiple extensions of Recurrent Neural Networks (RNNs) have been proposed recently to address the difficulty of storing information over long time periods. In this paper, we experiment with the capacity of Neural Turing Machines (NTMs) to deal with these long-term dependencies on well-balanced strings of parentheses. We show that not only does the NTM emulate a stack with its heads and learn an algorithm to recognize such words, but it is also capable of strongly generalizing to much longer sequences.
  • TL;DR: We show how the NTM can emulate a stack to recognize well-balanced strings of parentheses, enabling strong generalization to longer sequences.
  • Conflicts: snips.ai
  • Keywords: Deep learning, Supervised Learning
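For context on the task the abstract describes, the classical way to recognize well-balanced strings of parentheses is with an explicit stack: push on `(`, pop on `)`, and accept only if every closing symbol finds a match and the stack ends empty. The sketch below is this textbook algorithm, not the paper's learned NTM model — the paper's claim is that the NTM's heads come to emulate a procedure like this one.

```python
def is_balanced(word: str) -> bool:
    """Recognize well-balanced strings of parentheses with an explicit stack."""
    stack = []
    for symbol in word:
        if symbol == "(":
            stack.append(symbol)   # push on an opening parenthesis
        elif symbol == ")":
            if not stack:          # a closing parenthesis with nothing to match
                return False
            stack.pop()            # pop the matching opening parenthesis
        else:
            return False           # reject symbols outside the alphabet
    return not stack               # balanced iff every "(" was matched
```

For this single-parenthesis alphabet a plain counter would suffice; the stack version is shown because it is the data structure the NTM is said to emulate, and it extends directly to multiple bracket types.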