LLGformer: Learnable Long-range Graph Transformer for Traffic Flow Prediction

Published: 29 Jan 2025, Last Modified: 29 Jan 2025
WWW 2025 Poster
License: CC BY 4.0
Track: Graph algorithms and modeling for the Web
Keywords: Traffic flow prediction, Transformer, Predictive model, Spatio-temporal graph
TL;DR: LLGformer is a traffic flow prediction method built on a learnable global spatio-temporal graph, together with novel optimization strategies that improve training and inference efficiency.
Abstract: Traffic prediction plays a pivotal role in intelligent transportation systems. Most existing studies predict traffic flow for a target period from only a short window of recent data, such as the preceding hour, overlooking the periodicity present in traffic data. Moreover, most existing advanced methods either rely on manually constructed spatio-temporal graphs for joint modeling or use separate purely spatial and purely temporal modules, structural choices that limit the learning of complex spatio-temporal patterns in traffic data. To address these issues, we construct a learnable long-range spatio-temporal graph that better captures complex patterns in traffic data. We introduce a new model, LLGformer, which improves upon traditional Transformer-style models and learns traffic flow data more efficiently by integrating long-range historical information. Applying attention mechanisms on the spatio-temporal graph enables direct interaction of information across different time slices and locations. Additionally, we propose two optimization strategies that further speed up training and inference. Extensive experiments on four real-world datasets show that LLGformer significantly outperforms state-of-the-art methods.
Submission Number: 1341
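
The page carries no code, so the following minimal PyTorch sketch is only an illustration of the core idea in the abstract: attention applied over a joint spatio-temporal graph, so every (time slice, location) pair can exchange information directly. The class name, the dense learnable-bias design, and all parameters are our assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class SpatioTemporalGraphAttention(nn.Module):
        # Minimal sketch: full attention over (time slice, location) tokens,
        # biased by a learnable all-pairs weight matrix that stands in for the
        # learnable long-range spatio-temporal graph described in the abstract.
        def __init__(self, num_nodes: int, num_steps: int, dim: int, heads: int = 4):
            super().__init__()
            self.tokens = num_nodes * num_steps  # one token per (time, location) pair
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            # Learnable additive bias over every token pair (hypothetical design).
            self.graph_bias = nn.Parameter(torch.zeros(self.tokens, self.tokens))
            self.proj = nn.Linear(dim, dim)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, num_steps, num_nodes, dim) -> flatten time and space
            b, t, n, d = x.shape
            h = x.reshape(b, t * n, d)
            # A float attn_mask is added to the attention logits, so the learned
            # graph re-weights interactions across time slices and locations.
            out, _ = self.attn(h, h, h, attn_mask=self.graph_bias)
            return self.proj(out).reshape(b, t, n, d)

    # Usage: 12 time slices, 50 sensors, 32-dim features per token.
    x = torch.randn(2, 12, 50, 32)
    layer = SpatioTemporalGraphAttention(num_nodes=50, num_steps=12, dim=32)
    out = layer(x)  # same shape as x

Flattening time and space into a single token axis is the key design choice this sketch highlights: a learned bias over all token pairs lets attention span the whole spatio-temporal graph, whereas factorized models restrict interaction to within a single time slice or within a single node's history.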