Cont-GRU: Fully Continuous Gated Recurrent Units for Irregular Time Series

21 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Time-series, Continuous GRU
TL;DR: Previous continuous-time GRU models are only piecewise continuous. We propose the first fully continuous GRU.
Abstract: Recurrent models such as RNNs, LSTMs, and GRUs have long been used to process time series. However, they do not cope well with sporadically (i.e., irregularly) observed real-world data. To address this, several methods model RNNs/GRUs in a \emph{partially} continuous manner using ordinary differential equations (ODEs). In this paper, we instead propose Cont-GRU, which models GRUs as delay differential equations (DDEs). By redefining GRUs as DDEs, we show that i) all parts of a GRU (the hidden state, the reset gate, the update gate, and the update vector) can be interpreted \emph{fully} continuously, and ii) our method does not inherit the limitations of ODE-based models. In experiments on 5 real-world datasets against 17 baselines, Cont-GRU outperforms all baselines by non-trivial margins.
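To make the DDE formulation concrete, the sketch below shows one plausible way to delay the recurrent term of a standard GRU cell by a lag tau, so that the reset gate, the update gate, the update vector, and the hidden state are all defined at every time t. This is not the authors' code: the weight shapes, the input path `x_of_t`, the delay `tau`, and the fixed-step Euler integrator with a history buffer are all illustrative assumptions, and the paper's actual Cont-GRU dynamics may differ.

```python
# Hypothetical sketch (not the authors' code): a GRU cell rewritten as a
# delay differential equation (DDE). All names, dimensions, the input path,
# and the Euler integrator are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d_x, d_h = 3, 4               # input and hidden dimensions (assumed)
tau, dt, T = 0.5, 0.01, 5.0   # delay, step size, integration horizon (assumed)

# Randomly initialized GRU-style weights; training is out of scope here.
W_r, U_r = rng.normal(size=(d_h, d_x)), rng.normal(size=(d_h, d_h))
W_z, U_z = rng.normal(size=(d_h, d_x)), rng.normal(size=(d_h, d_h))
W_g, U_g = rng.normal(size=(d_h, d_x)), rng.normal(size=(d_h, d_h))

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))

def x_of_t(t):
    """Continuous input path; in practice an interpolation of the irregular observations."""
    return np.sin(t + np.arange(d_x))

def dde_rhs(t, h, h_delayed):
    """dh/dt for a GRU whose recurrent term is delayed by tau."""
    x = x_of_t(t)
    r = sigmoid(W_r @ x + U_r @ h_delayed)        # reset gate, defined at every t
    z = sigmoid(W_z @ x + U_z @ h_delayed)        # update gate, defined at every t
    g = np.tanh(W_g @ x + U_g @ (r * h_delayed))  # update vector
    return (1.0 - z) * (g - h)                    # continuous analogue of the GRU update

# Fixed-step Euler integration with a history buffer supplying h(t - tau).
n_steps, lag = int(T / dt), int(tau / dt)
hist = [np.zeros(d_h)]                            # constant history: h(t) = 0 for t <= 0
for k in range(n_steps):
    h = hist[-1]
    h_delayed = hist[max(0, len(hist) - 1 - lag)]
    hist.append(h + dt * dde_rhs(k * dt, h, h_delayed))

print("h(T) =", hist[-1])
```

The right-hand side (1 - z) * (g - h) mirrors the discrete GRU update h <- z * h + (1 - z) * g, whose one-step increment is exactly (1 - z) * (g - h); replacing the recurrent argument h with the delayed state h(t - tau) is what turns the ODE relaxation into a DDE.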
Supplementary Material: pdf
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3539