Turing Completeness of Bounded-Precision Recurrent Neural Networks

21 May 2021, 20:47 (edited 27 Oct 2021) · NeurIPS 2021 Poster
  • Keywords: Recurrent Neural Network, Stack Recurrent Neural Network, Neural Turing Machine, Turing Complete, Turing machine, Memory
  • TL;DR: We propose a novel growing memory module and show the existence of a 54-neuron bounded-precision RNN with growing memory modules that is Turing-complete.
  • Abstract: Previous works have proved that recurrent neural networks (RNNs) are Turing-complete. However, in the proofs, the RNNs allow for neurons with unbounded precision, which is neither practical in implementation nor biologically plausible. To remove this assumption, we propose a dynamically growing memory module made of neurons of fixed precision. The memory module dynamically recruits new neurons when more memories are needed, and releases them when memories become irrelevant. We prove that a 54-neuron bounded-precision RNN with growing memory modules can simulate a Universal Turing Machine, with time complexity linear in the simulated machine's time and independent of the memory size. The result is extendable to various other stack-augmented RNNs. Furthermore, we analyze the Turing completeness of both unbounded-precision and bounded-precision RNNs, revisiting and extending the theoretical foundations of RNNs.
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
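The paper's construction is not reproduced in this listing, but the core idea of the growing memory module — recruit a new fixed-precision cell when a memory is pushed, release it when the memory becomes irrelevant — can be sketched as a stack of single-bit cells. This is a conceptual illustration only; the class and method names are hypothetical, and the paper's actual 54-neuron construction is more involved:

```python
from collections import deque

class GrowingStackMemory:
    """Conceptual sketch (not the paper's construction) of a dynamically
    growing stack memory. Each cell holds one fixed-precision value
    (here a single bit, mimicking bounded-precision neurons): precision
    per cell stays fixed while total capacity grows with demand."""

    def __init__(self):
        self.cells = deque()  # each entry models one recruited neuron

    def push(self, bit):
        assert bit in (0, 1), "cells store bounded-precision values"
        self.cells.append(bit)  # recruit a new fixed-precision cell

    def pop(self):
        return self.cells.pop()  # release the cell once its memory is read

    def __len__(self):
        return len(self.cells)
```

Because each push and pop touches only the top cell, per-step cost is constant, consistent with the abstract's claim that simulation time is independent of memory size.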
