Hi-Patch: Hierarchical Patch GNN for Irregular Multivariate Time Series Modeling

23 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: irregular multivariate time series, graph neural network
Abstract: Multi-scale information is crucial for multivariate time series modeling. However, most existing multi-scale analysis methods for time series treat all variables in the same manner, which does not adapt well to Irregular Multivariate Time Series (IMTS), where different variables have distinct original scales/sampling rates. Extracting temporal and inter-variable dependencies at multiple scales in IMTS therefore remains challenging. To fill this gap, we propose Hi-Patch, a hierarchical patch graph network. The key components of Hi-Patch are an intra-patch graph layer and several inter-patch graph layers. The intra-patch graph layer applies a fully connected graph network within each patch to flexibly represent and fully capture the local temporal and inter-variable dependencies of densely sampled variables at the original scale, and aggregates the result into patch-level node representations. Several inter-patch graph layers are then stacked to form a hierarchical architecture: each layer updates patch-level nodes through a scale-specific graph network, progressively extracting more global temporal and inter-variable features of both sparsely and densely sampled variables, and aggregating them into the next level's patch-level node representations. The output of the last inter-patch graph layer is fed into task-specific decoders to adapt to different downstream tasks. Experimental results on 8 datasets show that Hi-Patch outperforms a range of state-of-the-art models on both IMTS forecasting and classification tasks. Code is available at this repository: https://anonymous.4open.science/r/Hi-Patch-F42E.
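
To make the pipeline the abstract outlines more concrete, below is a minimal PyTorch sketch of the two-stage design: message passing within each patch, masked aggregation into patch-level nodes, then stacked inter-patch layers that coarsen the patch graph level by level. This is an illustrative assumption of one way to realize the idea, not the authors' implementation (that is in the linked anonymous repository). The class names (GraphLayer, HiPatchSketch) and details such as attention-based message passing and pairwise mean-pooling between scales are hypothetical simplifications; the paper's per-variable patch nodes, scale-specific graph parameterizations, and task decoders are omitted.

import torch
import torch.nn as nn


class GraphLayer(nn.Module):
    # One round of message passing over a fully connected graph of nodes
    # (observation embeddings or patch-level embeddings). A masked softmax
    # attention plays the role of a learned adjacency matrix.
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x, mask):
        # x: (batch, nodes, dim); mask: (batch, nodes), 1 = observed node.
        scores = self.q(x) @ self.k(x).transpose(-1, -2) / x.shape[-1] ** 0.5
        scores = scores.masked_fill(mask[:, None, :] == 0, float("-inf"))
        attn = torch.nan_to_num(torch.softmax(scores, dim=-1))  # empty rows -> 0
        return x + attn @ self.v(x)  # residual update


class HiPatchSketch(nn.Module):
    # Intra-patch layer followed by stacked inter-patch layers; each level
    # halves the number of patch nodes by mean-pooling adjacent pairs
    # (assumes the patch count is a power of two, for simplicity).
    def __init__(self, dim, levels=3):
        super().__init__()
        self.intra = GraphLayer(dim)
        self.inter = nn.ModuleList(GraphLayer(dim) for _ in range(levels))

    def forward(self, x, mask):
        # x: (batch, patches, points_per_patch, dim) observation embeddings;
        # mask marks which points were actually observed (irregular sampling).
        b, p, s, d = x.shape
        h = self.intra(x.reshape(b * p, s, d), mask.reshape(b * p, s))
        # Aggregate observations into one node per patch (masked mean).
        w = mask.reshape(b * p, s, 1)
        h = ((h * w).sum(1) / w.sum(1).clamp(min=1)).reshape(b, p, d)
        node_mask = (mask.sum(-1) > 0).float()  # (batch, patches)
        for layer in self.inter:
            h = layer(h, node_mask)
            if h.shape[1] > 1:  # merge adjacent patches: next, coarser scale
                h = 0.5 * (h[:, 0::2] + h[:, 1::2])
                node_mask = (node_mask[:, 0::2] + node_mask[:, 1::2]).clamp(max=1)
        return h  # coarsest patch-level representation, fed to a decoder

A short usage example under the same assumptions: with 8 patches and 3 inter-patch levels, the patch graph coarsens 8 -> 4 -> 2 -> 1, leaving one node per series.

model = HiPatchSketch(dim=32, levels=3)
x = torch.randn(4, 8, 6, 32)               # 4 series, 8 patches, 6 points each
mask = torch.randint(0, 2, (4, 8, 6)).float()
out = model(x, mask)                       # shape (4, 1, 32)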
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2938