Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting

Published: 28 Jan 2022, Last Modified: 13 Feb 2023 (ICLR 2022 Oral)
Keywords: sparse attention, pyramidal graph, Transformer, time series forecasting, long-range dependence, multiresolution
Abstract: Accurate prediction of the future given the past based on time series data is of paramount importance, since it opens the door for decision making and risk management ahead of time. In practice, the challenge is to build a flexible but parsimonious model that can capture a wide range of temporal dependencies. In this paper, we propose Pyraformer by exploring the multiresolution representation of the time series. Specifically, we introduce the pyramidal attention module (PAM) in which the inter-scale tree structure summarizes features at different resolutions and the intra-scale neighboring connections model the temporal dependencies of different ranges. Under mild conditions, the maximum length of the signal traversing path in Pyraformer is a constant (i.e., $\mathcal O(1)$) with regard to the sequence length $L$, while its time and space complexity scale linearly with $L$. Extensive numerical results show that Pyraformer typically achieves the highest prediction accuracy in both single-step and long-range forecasting tasks with the least amount of time and memory consumption, especially when the sequence is long.
One-sentence Summary: We propose a multiresolution pyramidal attention mechanism for long-range dependence modeling and time series forecasting, reducing the maximum length of the signal traversing path to O(1) while achieving linear time and space complexity.
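
To make the pyramidal attention pattern concrete, the sketch below builds a sparse attention mask in which each node attends to its nearby neighbors within the same scale, its parent at the next-coarser scale, and (via the symmetric link) its children at the next-finer scale, so the number of allowed entries grows only linearly with the sequence length. This is a minimal NumPy illustration assuming C children per coarser node, A neighbors per side, and S scales; these parameter names and the exact connection rules are assumptions drawn from the abstract's description, not the authors' implementation.

import numpy as np

def pyramidal_attention_mask(L, C=4, A=1, S=3):
    # L: length of the finest-scale sequence (assumed divisible by C**(S-1))
    # C: number of finer-scale nodes summarized by each coarser-scale node
    # A: number of adjacent nodes attended to on each side within a scale
    # S: number of scales in the pyramid
    sizes = [L // (C ** s) for s in range(S)]      # nodes per scale, coarser scales are shorter
    offsets = np.cumsum([0] + sizes)               # start index of each scale in the flat node list
    N = offsets[-1]                                # total number of nodes across all scales
    mask = np.zeros((N, N), dtype=bool)

    for s in range(S):
        start = offsets[s]
        for i in range(sizes[s]):
            u = start + i
            # Intra-scale connections: the A nearest neighbors on each side.
            lo, hi = max(0, i - A), min(sizes[s], i + A + 1)
            mask[u, start + lo:start + hi] = True
            # Inter-scale connections: attend to the parent at the coarser scale,
            # and let the parent attend back to its C children (symmetric link).
            if s + 1 < S:
                parent = offsets[s + 1] + i // C
                mask[u, parent] = True
                mask[parent, u] = True
    return mask

mask = pyramidal_attention_mask(L=64)
print(mask.shape, mask.sum())  # O(L) allowed entries instead of O(L^2)

Because every node keeps at most 2A+1 intra-scale links, one parent, and C children, the sparse mask has O(L) nonzero entries, which is what gives the linear time and space complexity; the coarse-scale nodes act as shortcuts, so any two fine-scale positions are connected through a path whose length is bounded by a constant independent of L.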