Unification of Recurrent Neural Network Architectures and Quantum Inspired Stable Design

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission
Abstract: Various architectural advances in the design of recurrent neural networks~(RNN) have focused on improving empirical stability and representability at the cost of increased architectural complexity. However, much remains to be done to fully understand the fundamental trade-off between these conflicting requirements. Towards answering this question, we forsake the purely bottom-up approach of data-driven machine learning and instead examine the physical origin and dynamical properties of existing RNN architectures. This facilitates designing new RNNs with smaller complexity overhead and provable stability guarantees. First, we define a family of deep recurrent neural networks, $n$-$t$-ORNN, according to the order of nonlinearity $n$ and the range of temporal memory scale $t$ in their underlying dynamics, embodied in the form of discretized ordinary differential equations. We show that most existing proposals for RNN architectures belong to different orders of $n$-$t$-ORNN. We then propose a new RNN ansatz, the Quantum-inspired Universal computing Neural Network~(QUNN), which leverages the reversibility, stability, and universality of quantum computation to obtain a stable and universal RNN. QUNN reduces the number of training parameters from polynomial in both data size and correlation time to linear in correlation time only. Compared to Long Short-Term Memory (LSTM), a QUNN with the same number of hidden layers facilitates higher nonlinearity and a longer memory span with provable stability. Our work opens new directions in designing minimal RNNs based on additional knowledge about the dynamical nature of both the data and the training architecture.
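The abstract classifies RNN architectures by the discretized ordinary differential equations underlying their dynamics. As a minimal sketch of that viewpoint (not the paper's actual $n$-$t$-ORNN construction or the QUNN ansatz), the code below shows how a vanilla RNN update can be read as one forward-Euler step of a continuous-time ODE; the specific dynamics `dh/dt = -h + tanh(W h + U x + b)` and the step size `dt` are illustrative assumptions.

```python
import numpy as np

def euler_rnn_step(h, x, W, U, b, dt=0.1):
    """One forward-Euler step of the illustrative continuous-time dynamics
    dh/dt = -h + tanh(W @ h + U @ x + b).
    With dt = 1 this reduces to a standard vanilla-RNN state update,
    so the discretization step controls the effective memory time scale."""
    return h + dt * (-h + np.tanh(W @ h + U @ x + b))

# Hypothetical small example: hidden size 4, input size 3.
rng = np.random.default_rng(0)
d_h, d_x = 4, 3
W = rng.normal(scale=0.1, size=(d_h, d_h))  # recurrent weights
U = rng.normal(scale=0.1, size=(d_h, d_x))  # input weights
b = np.zeros(d_h)

h = np.zeros(d_h)
for t in range(20):
    x_t = rng.normal(size=d_x)  # random input sequence
    h = euler_rnn_step(h, x_t, W, U, b)
print(h.shape)  # (4,)
```

With small recurrent weights and a small `dt`, the `-h` leak term keeps the state bounded; the paper's broader point is that such dynamical properties (nonlinearity order, memory scale, stability) can be read off the underlying ODE rather than probed empirically.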
Keywords: theory and analysis of RNN architectures, reversible evolution, stability of deep neural networks, learning representations of outputs or states, quantum inspired embedding
TL;DR: We prove bounds on the nonlinearity and memory scale of the dynamics representable by various recurrent neural network designs, and propose a new RNN ansatz inspired by quantum physics.
5 Replies