Mixed-Memory RNNs for Learning Long-term Dependencies in Irregularly-sampled Time Series

TMLR Paper47 Authors

12 Apr 2022 (modified: 28 Feb 2023) · Rejected by TMLR
Abstract: Recurrent neural networks (RNNs) with continuous-time hidden states are a natural fit for modeling irregularly-sampled time series. These models, however, face difficulties when the input data possess long-term dependencies. We prove that, as with standard RNNs, the underlying cause of this issue is the vanishing or exploding of gradients during training, and that this phenomenon arises from the ordinary differential equation (ODE) representation of the hidden state, regardless of the choice of ODE solver. We provide a solution by equipping arbitrary continuous-time networks with a memory compartment that is separate from their time-continuous state. In this way, we encode a continuous-time dynamical flow within the RNN, allowing it to respond to inputs arriving at arbitrary time lags while ensuring constant error propagation through the memory path. We call these models Mixed-Memory RNNs (mmRNNs). We show experimentally that Mixed-Memory RNNs outperform recently proposed RNN-based counterparts on non-uniformly sampled data with long-term dependencies.
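Below is a minimal sketch of the mixed-memory idea described in the abstract: a continuous-time hidden state updated over an irregular time gap, paired with a separate gated memory compartment whose additive update provides a near-constant error-propagation path. This is an illustration under assumptions, not the paper's exact model; the explicit-Euler ODE step, the LSTM-style gating, and all names (`MixedMemoryCell`, `f`, `gates`, `dt`) are hypothetical choices made for the sketch.

```python
import torch
import torch.nn as nn


class MixedMemoryCell(nn.Module):
    """Illustrative mixed-memory cell (not the paper's exact architecture):
    a continuous-time state h driven by an ODE, plus a separate memory c
    with a gated, additive update along the memory path."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # ODE dynamics for the continuous-time state: dh/dt = f(x, h)
        self.f = nn.Sequential(
            nn.Linear(input_size + hidden_size, hidden_size),
            nn.Tanh(),
        )
        # LSTM-style gates for the separate memory compartment c
        self.gates = nn.Linear(input_size + hidden_size, 3 * hidden_size)
        self.cand = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x, h, c, dt):
        # One explicit-Euler step of the continuous-time state over the
        # (possibly irregular) elapsed time dt since the last observation.
        h = h + dt.unsqueeze(-1) * self.f(torch.cat([x, h], dim=-1))

        # Gated memory update: the additive path through c is what keeps
        # error flow from vanishing or exploding across long time lags.
        z = torch.cat([x, h], dim=-1)
        i, f_gate, o = torch.chunk(torch.sigmoid(self.gates(z)), 3, dim=-1)
        c = f_gate * c + i * torch.tanh(self.cand(z))

        # Output read from the memory, modulated by the output gate.
        y = o * torch.tanh(c)
        return y, h, c


# Example usage on a batch of irregularly-timed observations.
cell = MixedMemoryCell(input_size=4, hidden_size=32)
x = torch.randn(8, 4)       # batch of 8 observations
h = torch.zeros(8, 32)      # continuous-time hidden state
c = torch.zeros(8, 32)      # memory compartment
dt = torch.rand(8)          # per-sample elapsed time since last observation
y, h, c = cell(x, h, c, dt)
```

The design intent mirrors the abstract: the ODE path lets the cell react to inputs arriving at arbitrary time lags, while the separate memory path carries gradients across long dependencies.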
Submission Length: Regular submission (no more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=rOGm97YR22N
Changes Since Last Submission: We addressed all issues raised by the reviewers of ICLR 2022 as detailed under each review in the "Previous Submission Url" field.
Assigned Action Editor: ~Caglar_Gulcehre1
Submission Number: 47