Keywords: Deep Learning, Spatiotemporal Forecasting, Transformer, Graph Neural Network
Abstract: Time series data are ubiquitous, appearing in virtually every field of study. In multivariate time series, observations are interconnected both temporally and across components. In traffic flow analysis, for instance, traffic speeds at different intersections exhibit complex spatiotemporal correlations. Modelling this dual structure poses significant challenges, and most existing forecasting methods address it by learning spatial and temporal dependencies separately. In this work, we introduce T-Graphormer, a Transformer-based approach designed to model spatiotemporal correlations directly. Extending the Graphormer architecture to incorporate temporal dynamics, our method updates each node's representation by selectively attending to all other nodes within a graph sequence. This design enables the model to capture rich spatiotemporal patterns with minimal reliance on predefined spatiotemporal inductive biases. We validate the effectiveness of T-Graphormer on real-world traffic prediction benchmark datasets, achieving reductions of up to 10% in both root mean squared error (RMSE) and mean absolute percentage error (MAPE) compared with state-of-the-art methods.
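To illustrate the core idea described in the abstract, the sketch below shows joint spatiotemporal self-attention over a flattened graph sequence, where every node-time token can attend to every other token. This is a minimal, hypothetical illustration, not the authors' implementation: the module name `SpatiotemporalAttention`, the tensor layout (batch, time, nodes, features), and the hyperparameters are assumptions, and Graphormer-specific components such as structural and temporal encodings are omitted.

```python
# Minimal sketch (assumed, not the paper's code): self-attention applied
# jointly over all node-time positions of a graph sequence.
import torch
import torch.nn as nn


class SpatiotemporalAttention(nn.Module):
    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, nodes, d_model)
        b, t, n, d = x.shape
        tokens = x.reshape(b, t * n, d)             # flatten the graph sequence into one token set
        out, _ = self.attn(tokens, tokens, tokens)  # each node-time token attends to all others
        out = self.norm(tokens + out)               # residual connection + layer norm
        return out.reshape(b, t, n, d)


if __name__ == "__main__":
    # Hypothetical shapes: 12 time steps, 207 nodes (e.g., METR-LA-sized sensor graph)
    x = torch.randn(2, 12, 207, 64)
    y = SpatiotemporalAttention()(x)
    print(y.shape)  # torch.Size([2, 12, 207, 64])
```

Flattening time and nodes into a single token dimension is what lets one attention operation capture spatial, temporal, and cross (node-at-a-different-time) dependencies, in contrast to methods that apply separate spatial and temporal modules.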
Primary Area: learning on time series and dynamical systems
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 13412