Memory Learning of Multivariate Asynchronous Time Series

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission · Readers: Everyone
Keywords: Multivariate Asynchronous Time Series, Gated Recurrent Unit, Sequential Models
TL;DR: Modeling Multivariate Asynchronous Time Series
Abstract: Sequential observations from complex systems are usually collected irregularly and asynchronously across variables, and they are typically both serially and cross-sectionally dependent. Recurrent networks are commonly used to model such sequential data, attempting to capture marginal dynamics and dependence dynamics simultaneously with one shared memory. This leads to two problems. First, heterogeneous marginal information is difficult to preserve in the shared memory. Second, in an asynchronous setting, missing values across variables introduce bias into the shared memory. To solve these problems, this paper designs a new architecture that seamlessly integrates continuous-time ODE solvers with a set of memory-aware GRU blocks. It learns memory profiles separately and addresses the issue of asynchronous observations. Numerical results confirm that the new architecture outperforms a variety of state-of-the-art baseline models on datasets from various fields.
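The abstract's core idea (per-variable memory blocks whose hidden states evolve in continuous time between asynchronous observations, with a GRU update applied only to the observed variable) can be sketched roughly as below. This is a minimal illustrative sketch, not the paper's actual architecture: the class name `MemoryGRUODE`, the fixed-step Euler solver, and the shared linear-tanh dynamics `dh/dt = tanh(A h)` are all assumptions for exposition.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MemoryGRUODE:
    """Hypothetical sketch: one GRU memory block per variable, with an
    Euler ODE step evolving every hidden state between asynchronous
    observation times. Names and update rules are illustrative
    assumptions, not the submission's exact architecture."""

    def __init__(self, n_vars, hidden, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        # Separate GRU parameters per variable ("separate memory profiles").
        self.W = [{g: rng.normal(0, s, (hidden, hidden + 1)) for g in "zrh"}
                  for _ in range(n_vars)]
        self.b = [{g: np.zeros(hidden) for g in "zrh"} for _ in range(n_vars)]
        # Assumed shared continuous-time dynamics: dh/dt = tanh(A h).
        self.A = rng.normal(0, s, (hidden, hidden))
        self.h = [np.zeros(hidden) for _ in range(n_vars)]
        self.t = 0.0

    def _ode_evolve(self, h, dt, steps=4):
        # Fixed-step Euler solver standing in for a continuous-time ODE solver.
        step = dt / steps
        for _ in range(steps):
            h = h + step * np.tanh(self.A @ h)
        return h

    def observe(self, t, var, x):
        """Evolve all memories to time t, then apply a GRU update only to
        the memory of the observed variable `var`; unobserved variables
        are never imputed, which avoids biasing a shared memory."""
        dt = t - self.t
        self.h = [self._ode_evolve(h, dt) for h in self.h]
        self.t = t
        h, W, b = self.h[var], self.W[var], self.b[var]
        inp = np.concatenate([h, [x]])
        z = sigmoid(W["z"] @ inp + b["z"])              # update gate
        r = sigmoid(W["r"] @ inp + b["r"])              # reset gate
        h_tilde = np.tanh(W["h"] @ np.concatenate([r * h, [x]]) + b["h"])
        self.h[var] = (1 - z) * h + z * h_tilde
        return np.concatenate(self.h)  # joint state for dependence dynamics
```

Usage on an asynchronous stream, where each record is (time, variable index, value):

```python
model = MemoryGRUODE(n_vars=3, hidden=8)
state = model.observe(0.5, var=0, x=1.2)   # only variable 0's GRU fires
state = model.observe(1.3, var=2, x=-0.7)  # all memories first evolve 0.8 time units
```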
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Applications (eg, speech processing, computer vision, NLP)
Supplementary Material: zip
