Keywords: Local Geometry, Local Gaussian Process, Transformer Architecture, Time Series Analysis, Corruption Benchmark
TL;DR: We propose LGA, a local geometry-aware attention mechanism grounded in local Gaussian process theory, and introduce TSRBench, the first benchmark for evaluating the robustness of time series forecasting models under realistic corruptions.
Abstract: Transformers have demonstrated strong performance in time series forecasting, yet they often fail to capture the intrinsic structure of temporal data, making them susceptible to real-world noise and anomalies. Unlike in vision or language, the local geometry of temporal patterns is a critical feature in time series forecasting, but it is frequently disrupted by corruptions.
In this work, we address this gap with two key contributions. First, we propose Local Geometry Attention (LGA), a novel attention mechanism theoretically grounded in local Gaussian process theory. LGA adapts to the intrinsic data geometry by learning query-specific distance metrics, enabling it to model complex temporal dependencies and enhance resilience to noise. Second, we introduce TSRBench, the first comprehensive benchmark for evaluating forecasting robustness under realistic, statistically-grounded corruptions.
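The abstract does not detail how query-specific distance metrics enter the attention computation, but the general idea of replacing dot-product scores with a learned, per-query distance can be sketched as follows. This is a minimal illustrative sketch only: the function name, the diagonal-metric parameterization via `W_metric`, and the softplus scaling are assumptions, not the authors' actual LGA formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def distance_metric_attention(Q, K, V, W_metric):
    """Toy sketch of attention with a query-specific distance metric.

    Q: (n_q, d) queries; K: (n_k, d) keys; V: (n_k, d_v) values.
    W_metric: (d, d) hypothetical projection mapping each query to
    per-dimension scales of a diagonal metric.
    """
    # Softplus keeps the learned per-query scales positive.
    scales = np.log1p(np.exp(Q @ W_metric))        # (n_q, d)
    diff = Q[:, None, :] - K[None, :, :]           # (n_q, n_k, d)
    # Negative scaled squared distance replaces the dot-product score,
    # so attention concentrates on keys that are close under the
    # query's own metric (its local geometry).
    scores = -np.einsum('qkd,qd->qk', diff ** 2, scales)
    A = softmax(scores, axis=-1)                   # rows sum to 1
    return A @ V
```

Under such a scheme, a noisy key that is far from the query under the learned metric receives an exponentially small weight, which is one plausible route to the noise resilience the abstract claims.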
Experiments on TSRBench show that LGA significantly reduces performance degradation, consistently outperforming both Transformer-based and linear models. These results establish a foundation for developing robust time series models that can be deployed in real-world applications where data quality is not guaranteed. Our code is available at: https://github.com/dongbeank/LGA.
Primary Area: learning on time series and dynamical systems
Submission Number: 5325