A recurrent neural network without chaos

ICLR 2017 conference submission (edited Feb 21, 2017)
  • Abstract: We introduce an exceptionally simple gated recurrent neural network (RNN) that achieves performance comparable to well-known gated architectures, such as LSTMs and GRUs, on the word-level language modeling task. We prove that our model has simple, predictable, and non-chaotic dynamics. This stands in stark contrast to more standard gated architectures, whose underlying dynamical systems exhibit chaotic behavior.
  • Conflicts: lmu.edu, csulb.edu
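To make the abstract's claim concrete, here is a minimal NumPy sketch of a gated update of the kind the paper describes. The specific update rule, gate structure, weight names, and initialization below are my recollection and illustration, not text from this page; the point is only to show why such a state update cannot be chaotic: with zero input the state simply contracts toward the origin, and with arbitrary inputs it stays bounded.

```python
import numpy as np

# Illustrative chaos-free gated cell in the spirit of the paper's model
# (this update rule is a recollection of the paper; names are illustrative):
#   h_t = theta_t * tanh(h_{t-1}) + eta_t * tanh(W x_t)
# where theta_t and eta_t are sigmoid gates in (0, 1).

rng = np.random.default_rng(0)
d = 8  # hidden/input dimension, arbitrary for this demo

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small random weights (hypothetical initialization, for illustration only).
W, U_t, V_t, U_e, V_e = (0.1 * rng.standard_normal((d, d)) for _ in range(5))
b_t = np.zeros(d)
b_e = np.zeros(d)

def cfn_step(h, x):
    theta = sigmoid(U_t @ h + V_t @ x + b_t)  # forget-style gate in (0, 1)
    eta = sigmoid(U_e @ h + V_e @ x + b_e)    # input gate in (0, 1)
    return theta * np.tanh(h) + eta * np.tanh(W @ x)

# With zero input the state contracts toward the origin:
# |tanh(h)| < |h| and theta < 1, so each step shrinks the state.
h = rng.standard_normal(d)
for _ in range(200):
    h = cfn_step(h, np.zeros(d))
decayed_norm = float(np.linalg.norm(h))

# With arbitrary inputs the state stays bounded: each of the two terms
# in the update has magnitude strictly less than 1, so |h_t| < 2 always.
h2 = rng.standard_normal(d)
max_abs = 0.0
for _ in range(200):
    h2 = cfn_step(h2, rng.standard_normal(d))
    max_abs = max(max_abs, float(np.max(np.abs(h2))))
```

The two loops illustrate the "simple, predictable" dynamics claimed in the abstract: the input-free system decays to a unique fixed point at zero, and the driven system never leaves a fixed bounded region, in contrast to the chaotic input-free dynamics the paper attributes to LSTMs and GRUs.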