Dynamic Neural Turing Machine with Continuous and Discrete Addressing Schemes

Submitted to ICLR 2017
Abstract: In this paper, we extend the neural Turing machine (NTM) into a dynamic neural Turing machine (D-NTM) by introducing a trainable memory addressing scheme. This scheme maintains, for each memory cell, two separate vectors: a content vector and an address vector. This allows the D-NTM to learn a wide variety of location-based addressing strategies, both linear and nonlinear. We implement the D-NTM with both continuous, differentiable and discrete, non-differentiable read/write mechanisms. We investigate the mechanisms and effects of learning to read from and write to a memory through experiments on the Facebook bAbI tasks, using both feedforward and GRU controllers. The D-NTM is evaluated on a set of Facebook bAbI tasks and shown to outperform NTM and LSTM baselines. We also provide further experimental results on the sequential MNIST, associative recall, and copy tasks.
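To make the addressing scheme concrete, the following is a minimal sketch in Python/NumPy, not the authors' implementation. It assumes each memory cell pairs a trainable address vector with its content vector: a controller-emitted key is matched against the address part only, and reading mixes the content part either softly (continuous, differentiable) or by sampling a single cell (discrete, non-differentiable; the paper trains this choice with REINFORCE). Names such as dntm_read, address_dim, and content_dim are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, address_dim, content_dim = 8, 4, 6

address = rng.standard_normal((N, address_dim))  # trainable, location-like part
content = rng.standard_normal((N, content_dim))  # part written/updated by the model

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dntm_read(key, discrete=False):
    """Read from memory with continuous (soft) or discrete (hard) addressing."""
    scores = address @ key   # similarity of the key to each address vector
    w = softmax(scores)      # continuous, differentiable weighting over cells
    if discrete:
        # Non-differentiable path: commit to a single cell; in the paper this
        # decision is trained with REINFORCE rather than backpropagation.
        i = rng.choice(N, p=w)
        w = np.eye(N)[i]
    return w @ content       # weighted sum of content vectors

key = rng.standard_normal(address_dim)
soft_read = dntm_read(key)                 # differentiable read
hard_read = dntm_read(key, discrete=True)  # discrete read of one cell
```

Because the key attends over learned address vectors rather than fixed positions, gradient descent can arrange those vectors to realize linear or nonlinear location-based access patterns.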
TL;DR: We propose a new type of Neural Turing Machine that is simpler than the original model and achieves better results than the baselines on non-trivial tasks.
Conflicts: umontreal.ca, nyu.edu, twitter.com
Keywords: Deep learning, Natural language processing, Reinforcement Learning