Learning Spatio-Temporal Representation for Multivariate Time Series

22 Sept 2023 (modified: 25 Mar 2024), ICLR 2024 Conference Withdrawn Submission
Keywords: multivariate time series, contrastive learning, representation learning, spatio-temporal consistency
TL;DR: We propose a novel contrastive representation learning method that encourages a spatio-temporal consistency to reflect the spatial structure of multivariate time series along with its temporal dependency.
Abstract: Label sparsity makes it difficult to exploit the label information of multivariate time series (MTS) in practice. Thus, unsupervised representation learning methods have been studied to learn effective representations of MTS without label information. Recently, many studies have employed contrastive learning to generate robust representations by capturing the underlying information of MTS. However, these methods have limitations, such as insufficient consideration of the relationships among MTS variables and high sensitivity to the construction of positive pairs. We propose a novel spatio-temporal contrastive representation learning method (STCR) for generating effective MTS representations suitable for classification and forecasting tasks. STCR learns representations by encouraging spatio-temporal consistency, which comprehensively reflects the spatial information and temporal dependency of MTS while mitigating sensitivity to the construction of positive pairs for contrastive learning. The results of extensive experiments on MTS classification and forecasting tasks demonstrate that STCR generates high-quality representations and achieves state-of-the-art performance on both tasks.
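The abstract does not specify STCR's exact loss, but contrastive methods of this kind typically build on an InfoNCE-style objective applied to two augmented views of the same series (e.g., overlapping crops), where matching rows form positive pairs. The sketch below is a generic, illustrative implementation of that standard building block, not the authors' method; all function names and the toy encoder output are assumptions.

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-12):
    """Row-wise L2 normalization so dot products become cosine similarities."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def info_nce(z1, z2, temperature=0.1):
    """Generic InfoNCE contrastive loss between two views (illustrative, not STCR).

    z1, z2: (N, D) L2-normalized embeddings; row i of z1 and row i of z2
    are a positive pair, all other cross-view rows act as negatives.
    """
    # Cosine-similarity logits between every cross-view pair.
    logits = z1 @ z2.T / temperature                      # (N, N)
    # Softmax cross-entropy with target i for row i (stable log-softmax).
    logits = logits - logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))

# Toy usage: stand-in encoder outputs for two noisy views of 8 series.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 64))
z1 = l2_normalize(x + 0.05 * rng.normal(size=x.shape))
z2 = l2_normalize(x + 0.05 * rng.normal(size=x.shape))
loss = info_nce(z1, z2)
```

A spatio-temporal variant would combine such a term computed along the time axis with one computed across variables; the abstract suggests STCR couples the two through its spatio-temporal consistency objective.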
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4625