Keywords: traffic speed forecasting, adaptive attention, selective state-space model, time series
TL;DR: We propose Tramba, a traffic speed forecasting framework that couples a selective state-space model with an adaptive attention mechanism, improving predictive accuracy and enabling dynamic spatial reasoning in complex urban road networks.
Abstract: We introduce \textbf{Tramba}, a novel deep learning model for traffic speed forecasting in complex urban road networks. Unlike conventional methods that rely heavily on short-term trends or local spatial proximity (e.g., upstream and downstream links), Tramba captures dynamic, long-range dependencies across both time and space. It does so by integrating two key components: a Mamba-based temporal encoder that models long-term historical patterns of the target link, and an adaptive attention mechanism that learns temporally similar patterns from non-adjacent road links across the network. We evaluate Tramba on a real-world dataset from Gangnam-gu, Seoul, comprising 5-minute interval speed measurements across 366 road segments. Tramba is tested over forecasting horizons from 1 to 36 steps and compared with six strong baselines. It consistently outperforms all alternatives, achieving an average MAPE of 10.17\%, MAE of 2.80~km/h, and MSE of 20.50~(km/h)$^2$ on the three datasets for 12-step forecasting. These results highlight Tramba’s ability to model long-range dependencies and detect non-local influences in complex urban networks, reducing prediction lag and improving robustness in dynamic traffic conditions. Code is available at~\url{https://github.com/tr-anon-users/tramba-code}.
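The abstract describes two components: a temporal encoder over each link's history and an adaptive attention step that lets the target link draw on temporally similar, possibly non-adjacent links. The snippet below is a minimal, hypothetical PyTorch sketch of that two-component design, not the authors' implementation (see the linked repository); the module names, dimensions, and the use of `nn.GRU` as a stand-in for the Mamba (selective state-space) temporal encoder are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class TrambaSketch(nn.Module):
    """Illustrative sketch: per-link temporal encoding + network-wide adaptive attention."""

    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)          # speed value -> embedding
        # Placeholder temporal encoder; a Mamba block (e.g., mamba_ssm.Mamba)
        # would consume the same (batch, time, d_model) input shape.
        self.temporal_encoder = nn.GRU(d_model, d_model, batch_first=True)
        # Adaptive attention: the target link queries every link in the network,
        # so non-adjacent links with similar temporal patterns can contribute.
        self.spatial_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)                # one-step speed forecast

    def forward(self, speeds, target_idx):
        # speeds: (batch, n_links, n_timesteps) historical speeds for every link
        b, n_links, t = speeds.shape
        x = self.input_proj(speeds.reshape(b * n_links, t, 1))
        h, _ = self.temporal_encoder(x)                  # (b*n_links, t, d_model)
        link_emb = h[:, -1, :].reshape(b, n_links, -1)   # per-link temporal summary
        query = link_emb[:, target_idx:target_idx + 1]   # (b, 1, d_model) target link
        context, _ = self.spatial_attn(query, link_emb, link_emb)
        return self.head(context).squeeze(-1)            # (b, 1) predicted speed


# Toy usage: 366 links (as in the Gangnam-gu data), 12 historical steps, batch of 2.
model = TrambaSketch()
pred = model(torch.randn(2, 366, 12), target_idx=0)
print(pred.shape)  # torch.Size([2, 1])
```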
Primary Area: learning on time series and dynamical systems
Submission Number: 16390