A recurrent neural network without chaos

Published: 06 Feb 2017, Last Modified: 05 May 2023 · ICLR 2017 Poster
Abstract: We introduce an exceptionally simple gated recurrent neural network (RNN) that achieves performance comparable to well-known gated architectures, such as LSTMs and GRUs, on the word-level language modeling task. We prove that our model has simple, predictable and non-chaotic dynamics. This stands in stark contrast to more standard gated architectures, whose underlying dynamical systems exhibit chaotic behavior.
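The abstract does not spell out the update rule, so the following is only a minimal sketch of the kind of simple gated cell it describes: two sigmoid gates that decay the previous hidden state and inject the current input, with no extra cell state. The class name, parameter names, and the exact update equation here are assumptions for illustration, not text from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SimpleGatedCell:
    """Hypothetical minimal gated RNN cell in the spirit of the abstract:
    a forget gate on the previous state plus an input gate on the new input."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Input projection and the recurrent/input weights of the two gates.
        self.W = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_f = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.V_f = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_i = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.V_i = rng.uniform(-s, s, (hidden_size, input_size))
        self.b_f = np.zeros(hidden_size)
        self.b_i = np.zeros(hidden_size)

    def step(self, x, h_prev):
        # Forget and input gates, both computed from h_{t-1} and x_t.
        f = sigmoid(self.U_f @ h_prev + self.V_f @ x + self.b_f)
        i = sigmoid(self.U_i @ h_prev + self.V_i @ x + self.b_i)
        # Assumed gated update: decay the previous state, add the gated input.
        return f * np.tanh(h_prev) + i * np.tanh(self.W @ x)

# With zero input the state magnitude shrinks at every step (f < 1 and
# |tanh(h)| < |h|), illustrating the simple, non-chaotic decay the paper claims.
cell = SimpleGatedCell(input_size=4, hidden_size=8)
h = np.ones(8)
for _ in range(5):
    h = cell.step(np.zeros(4), h)
print(h)
```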
Conflicts: lmu.edu, csulb.edu