A recurrent neural network without chaos

Thomas Laurent, James von Brecht

Nov 04, 2016 (modified: Feb 21, 2017) · ICLR 2017 conference submission
  • Abstract: We introduce an exceptionally simple gated recurrent neural network (RNN) that achieves performance comparable to well-known gated architectures, such as LSTMs and GRUs, on the word-level language modeling task. We prove that our model has simple, predictable and non-chaotic dynamics. This stands in stark contrast to more standard gated architectures, whose underlying dynamical systems exhibit chaotic behavior.
  • Conflicts: lmu.edu, csulb.edu
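
The abstract only names the model, so as an illustration, here is a minimal NumPy sketch of the kind of simple gated recurrent update it describes. The class name ChaosFreeCell and the specific update rule (two sigmoid gates, theta and eta, modulating tanh(h_{t-1}) and tanh(W x_t)) are assumptions made for illustration, not a verified transcription of the paper's equations.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class ChaosFreeCell:
        """Hypothetical simple gated RNN cell, sketched from the abstract's
        description; the exact update rule is an assumption, not the paper's
        verified architecture:

            h_t = theta_t * tanh(h_{t-1}) + eta_t * tanh(W @ x_t)

        where theta_t and eta_t are sigmoid gates of (h_{t-1}, x_t).
        """

        def __init__(self, input_size, hidden_size, seed=0):
            rng = np.random.default_rng(seed)
            s = 1.0 / np.sqrt(hidden_size)
            def mat(rows, cols):
                return rng.uniform(-s, s, (rows, cols))
            self.W = mat(hidden_size, input_size)   # input projection
            self.U_t = mat(hidden_size, hidden_size)  # gate weights (theta)
            self.V_t = mat(hidden_size, input_size)
            self.U_e = mat(hidden_size, hidden_size)  # gate weights (eta)
            self.V_e = mat(hidden_size, input_size)
            self.b_t = np.zeros(hidden_size)           # gate biases
            self.b_e = np.zeros(hidden_size)

        def step(self, h_prev, x):
            theta = sigmoid(self.U_t @ h_prev + self.V_t @ x + self.b_t)  # "forget"-style gate
            eta = sigmoid(self.U_e @ h_prev + self.V_e @ x + self.b_e)    # input gate
            # No additive memory cell as in an LSTM: the new state is a gated,
            # squashed copy of the old state plus a gated, squashed input.
            return theta * np.tanh(h_prev) + eta * np.tanh(self.W @ x)

    # With zero input, each coordinate of the state contracts at every step
    # (|theta * tanh(h)| < |h|), so the dynamics settle on a fixed point at
    # the origin instead of wandering on a chaotic attractor.
    cell = ChaosFreeCell(input_size=4, hidden_size=8)
    h = np.ones(8)
    for _ in range(50):
        h = cell.step(h, np.zeros(4))
    print(np.abs(h).max())  # near 0 after a few dozen steps

Under this kind of update, the zero-input trajectory converges to a single fixed point; this is the sort of simple, predictable behavior the abstract contrasts with LSTMs and GRUs, whose dynamics can be chaotic even without input.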
