Persistence-based Contrastive Learning with Graph Neural Recurrent Networks for Time-series Forecasting
Keywords: Spatio-temporal forecasting, graph neural network, topological data analysis, contrastive learning
Abstract: In recent years, combinations of graph convolution and recurrent architectures have emerged as a powerful new alternative for multivariate spatio-temporal forecasting, with applications ranging from biosurveillance to traffic monitoring. However, such methods often suffer from vulnerability to noise and limited generalization ability, especially when the semantics and structural properties of time series evolve over time. To address these limitations, we propose a simple yet flexible and highly effective framework, Persistence-based Contrastive Learning with Graph Neural Recurrent Networks (PCL-GCRN). The key idea behind PCL-GCRN is the notion of topological invariance, which we introduce to contrastive graph learning for multivariate spatio-temporal processes. PCL-GCRN allows us to simultaneously focus on the most important shape characteristics of the data at multiple granularities, which play a key role in learning performance. As a result, PCL-GCRN leads to richer data augmentation, improved performance, and enhanced robustness. Our extensive experiments on a broad range of real-world datasets, from spatio-temporal traffic forecasting to monkeypox surveillance, suggest that PCL-GCRN yields competitive results in terms of both prediction accuracy and robustness, outperforming 19 competing approaches.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Deep Learning and representational learning
Supplementary Material: zip