Connecting the Patches: Multivariate Long-term Forecasting using Graph and Recurrent Neural Network

18 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: multivariate time series forecasting, Transformer, graph neural network, temporal order information
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose GRformer, a Transformer-based model that captures cross-channel dependencies and temporal order information for multivariate time series forecasting.
Abstract: Many Transformer-based models have achieved strong performance on multivariate long-term time series forecasting (MLTSF) tasks in recent years, but they remain ineffective at capturing cross-channel dependencies and temporal order information. In multivariate time series analysis, cross-channel dependencies help the model understand the correlations between individual series, and preserving the temporal order of the sequence is also essential for more accurate predictions. We therefore propose GRformer, which adopts a graph neural network (GNN) and a positional encoding based on a recurrent neural network (RNN) to better process multivariate time series data. We design a mix-hop propagation layer and embed it in the feedforward network to encourage proper interaction between different time series. To introduce temporal order information, we use a multi-layer RNN to recursively generate positional embeddings for sequence elements. Experiments on eight real-world datasets show that our model achieves more accurate predictions on MLTSF tasks.
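To make the two mechanisms named in the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation: a mix-hop propagation layer that mixes multi-hop graph propagation of channel features, and a GRU that recursively generates positional embeddings. The layer sizes, the retain ratio `beta`, the propagation depth, and the use of a GRU (the abstract only says "multi-layer RNN") are all illustrative assumptions.

```python
# Hypothetical sketch of the two components described in the abstract.
# Shapes, hyperparameters, and the GRU choice are assumptions, not the paper's code.
import torch
import torch.nn as nn


class MixHopPropagation(nn.Module):
    """Propagate channel features over a graph for `depth` hops, then mix the hops."""

    def __init__(self, dim: int, depth: int = 2, beta: float = 0.05):
        super().__init__()
        self.depth = depth
        self.beta = beta  # fraction of the original input retained at each hop
        # Project the concatenated hop outputs back to the model dimension.
        self.mlp = nn.Linear(dim * (depth + 1), dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (batch, num_channels, dim); adj: (num_channels, num_channels)
        # Row-normalize the adjacency with self-loops before propagating.
        adj = adj + torch.eye(adj.size(0), device=adj.device)
        adj = adj / adj.sum(dim=1, keepdim=True)
        out, hops = h, [h]
        for _ in range(self.depth):
            out = self.beta * h + (1 - self.beta) * torch.einsum(
                "vw,bwd->bvd", adj, out
            )
            hops.append(out)
        return self.mlp(torch.cat(hops, dim=-1))


class RecurrentPositionalEncoding(nn.Module):
    """Generate positional embeddings by unrolling a GRU from a learned start token."""

    def __init__(self, dim: int):
        super().__init__()
        self.start = nn.Parameter(torch.randn(1, 1, dim))
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, seq_len: int) -> torch.Tensor:
        # Feeding the start token at every step makes each hidden state the
        # embedding of its position; output shape: (1, seq_len, dim).
        inputs = self.start.expand(1, seq_len, -1)
        pos, _ = self.gru(inputs)
        return pos


if __name__ == "__main__":
    B, V, L, D = 8, 7, 96, 64  # batch, channels, sequence positions, model dim
    x = torch.randn(B, V, D)
    adj = torch.rand(V, V)
    print(MixHopPropagation(D)(x, adj).shape)       # torch.Size([8, 7, 64])
    print(RecurrentPositionalEncoding(D)(L).shape)  # torch.Size([1, 96, 64])
```

In this reading, the recurrence gives positional embeddings an inherently ordered structure (each embedding depends on all earlier ones), which is one plausible way to inject the temporal order information the abstract emphasizes.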
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1208