Abstract: Traffic prediction provides vital support for urban traffic management and has received extensive research interest. By virtue of their ability to learn spatial and temporal dependencies from a global view, Transformers have achieved superior performance in long-term traffic prediction. However, existing methods usually underestimate the complex spatio-temporal entanglement in long-range sequences. Compared with purely temporal entanglement, spatio-temporal data exhibits entangled dynamics under the constraints of traffic networks, which brings additional difficulties. Moreover, the computational cost of spatio-temporal Transformers scales quadratically with sequence length, limiting their application to long-range and large-scale scenarios. To address these problems, we propose a decomposed spatio-temporal Mamba (DST-Mamba) for traffic prediction. We apply temporal decomposition to the entangled sequences to obtain their seasonal and trend parts. Shifting from the temporal view to the spatial view, we leverage Mamba, a state space model with near-linear complexity, to capture seasonal variations in a node-centric manner; meanwhile, multi-scale trend information is extracted and aggregated by simple linear layers. This combination equips DST-Mamba with a superior capability to model long-range spatio-temporal dependencies while remaining efficient compared with Transformers. Experimental results on five real-world datasets demonstrate that DST-Mamba captures both local fluctuations and global trends within traffic patterns, achieving state-of-the-art performance with favorable efficiency.
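The decompose-then-route idea summarized above can be illustrated with a minimal PyTorch sketch. Everything in it is assumed for illustration only and is not the authors' implementation: the moving-average decomposition, the `(batch, time, nodes)` tensor layout, the single-scale trend mapping, and the GRU that merely stands in for a Mamba (selective state-space) block scanning over the node axis.

```python
# Hypothetical sketch of a decomposed spatio-temporal model: seasonal part goes
# through a node-centric sequence model, trend part through simple linear layers.
import torch
import torch.nn as nn


class SeriesDecomposition(nn.Module):
    """Split a series into trend (moving average) and seasonal (residual) parts."""

    def __init__(self, kernel_size: int = 5):
        super().__init__()
        self.pool = nn.AvgPool1d(kernel_size, stride=1,
                                 padding=kernel_size // 2,
                                 count_include_pad=False)

    def forward(self, x):                              # x: (batch, time, nodes)
        trend = self.pool(x.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return seasonal, trend


class DSTSketch(nn.Module):
    """Seasonal branch: sequence model over nodes. Trend branch: linear mapping."""

    def __init__(self, in_len: int, out_len: int, d_model: int = 64):
        super().__init__()
        self.decomp = SeriesDecomposition()
        self.embed = nn.Linear(in_len, d_model)        # embed each node's history
        # Placeholder sequence model scanning over the node axis; a real
        # implementation would use a Mamba / state-space block here.
        self.node_mixer = nn.GRU(d_model, d_model, batch_first=True)
        self.seasonal_head = nn.Linear(d_model, out_len)
        self.trend_head = nn.Linear(in_len, out_len)   # simple linear trend layer

    def forward(self, x):                              # x: (batch, time, nodes)
        seasonal, trend = self.decomp(x)
        s = self.embed(seasonal.transpose(1, 2))       # (batch, nodes, d_model)
        s, _ = self.node_mixer(s)                      # scan across the node sequence
        seasonal_out = self.seasonal_head(s).transpose(1, 2)
        trend_out = self.trend_head(trend.transpose(1, 2)).transpose(1, 2)
        return seasonal_out + trend_out                # (batch, out_len, nodes)


if __name__ == "__main__":
    model = DSTSketch(in_len=12, out_len=12)
    y = model(torch.randn(8, 12, 207))                 # e.g. 207 sensors
    print(y.shape)                                     # torch.Size([8, 12, 207])
```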