TimeStacker: A Novel Framework with Multilevel Observation for Capturing Nonstationary Patterns in Time Series Forecasting
Abstract: Real-world time series inherently exhibit significant non-stationarity, which poses substantial challenges for forecasting. To address this issue, this paper proposes a novel prediction framework, TimeStacker, designed to overcome the limitations of existing models in capturing the characteristics of non-stationary signals. Through a unique stacking mechanism, TimeStacker captures global signal features while thoroughly exploring local details. The framework further integrates a frequency-based self-attention module, significantly enhancing its feature-modeling capability. Experimental results demonstrate that TimeStacker achieves outstanding performance across multiple real-world datasets spanning the energy, finance, and weather domains, delivering superior predictive accuracy while using fewer parameters and less computation.
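As a rough illustration of the kind of frequency-based self-attention the abstract refers to (the concrete TimeStacker architecture is not specified here), the following PyTorch sketch treats the frequency bins of a windowed series as tokens and applies standard self-attention over them; the module name, dimensions, and pooling choice are all illustrative assumptions rather than the authors' design.

```python
import torch
import torch.nn as nn

class FrequencySelfAttention(nn.Module):
    """Hypothetical sketch: self-attention over the frequency bins of a windowed series."""
    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(2, d_model)   # map (real, imag) of each bin to an embedding
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):                    # x: (batch, window_length)
        spec = torch.fft.rfft(x, dim=-1)     # complex spectrum, (batch, n_bins)
        tokens = torch.stack([spec.real, spec.imag], dim=-1)  # (batch, n_bins, 2)
        h = self.embed(tokens)               # (batch, n_bins, d_model)
        h, _ = self.attn(h, h, h)            # attend across frequency bins
        return h.mean(dim=1)                 # pooled feature for this window
```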
Lay Summary: Due to the time-frequency uncertainty principle, the frequency of a non-stationary signal at a specific moment cannot be precisely determined. This raises the question of how the temporal evolution of frequency in such signals can be effectively captured.
We propose the TimeStacker framework, which captures the temporal evolution of frequency in non-stationary signals by combining windows of different sizes.
Our study approaches time series forecasting from a time-frequency variation perspective, achieving better performance while using fewer parameters.
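A minimal sketch of the multi-window idea described above, assuming non-overlapping windows and magnitude spectra as the per-scale summary (the actual TimeStacker stacking mechanism may differ):

```python
import torch

def multi_scale_spectra(x, window_sizes=(16, 32, 64)):
    """Illustrative only: summarize frequency content of a 1-D series x
    (shape: (length,)) at several window sizes."""
    features = []
    for w in window_sizes:
        n = x.shape[0] // w                 # number of full windows at this scale
        windows = x[: n * w].reshape(n, w)  # non-overlapping windows, remainder dropped
        mag = torch.fft.rfft(windows, dim=-1).abs()  # per-window magnitude spectrum
        features.append(mag.mean(dim=0))    # average spectrum at this scale
    return features                          # one tensor per window size
```

Comparing the per-scale spectra gives a coarse picture of how frequency content changes with the observation scale, which is the intuition the lay summary appeals to.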
Primary Area: Deep Learning->Sequential Models, Time series
Keywords: deep learning, time series forecasting, non-stationarity, attention
Submission Number: 5310