Divide-and-Conquer Time Series Forecasting with Auto-Frequency-Correlation via Cross-Channel Attention

21 Sept 2023 (modified: 11 Feb 2024), Submitted to ICLR 2024
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Time series forecasting, deep learning
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: To model various short-term temporal variations, we propose an effective Transformer-based design, termed FreCoformer. FreCoformer is built on top of the frequency domain and comprises three key designs: a frequency patching operation and two independent observations of these patches. The patching process refines the frequency information, enhancing locality. The subsequent observations extract consistent representations across different channels via attention computation and summarize the relevant sub-frequencies to identify eventful frequency correlations underlying short-term variations. To improve the fit to different time series scenarios, we propose a divide-and-conquer framework and introduce a simple linear-projection-based module incorporated into FreCoformer. These modules learn both long-term and short-term temporal variations of time series by observing their changes in the time and frequency domains. Extensive experiments show that our proposal outperforms other baselines on different real-world time series datasets. We further introduce a lightweight variant of FreCoformer with attention matrix approximation, which achieves comparable performance with far fewer parameters and lower computation cost.
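The abstract describes two core mechanisms: patching the frequency spectrum of each channel, then attending across channels within each sub-frequency patch. The paper's actual implementation is not shown on this page, so the following is only a minimal NumPy sketch of that idea under assumed shapes and a plain scaled dot-product attention; the function names (`frequency_patches`, `cross_channel_attention`) and the magnitude-spectrum choice are illustrative assumptions, not the authors' code.

```python
import numpy as np

def frequency_patches(x, patch_len):
    # x: (channels, time). Move each channel to the frequency domain and
    # split its spectrum into contiguous sub-frequency patches (locality).
    spec = np.abs(np.fft.rfft(x, axis=-1))            # (C, F) magnitude spectrum (assumed)
    F = spec.shape[-1] - spec.shape[-1] % patch_len   # drop the remainder bins
    return spec[:, :F].reshape(x.shape[0], -1, patch_len)  # (C, P, patch_len)

def cross_channel_attention(patches):
    # patches: (C, P, L). For each sub-frequency patch, the C channel
    # vectors act as queries/keys/values, so attention weights express
    # which channels share consistent structure at those frequencies.
    C, P, L = patches.shape
    out = np.empty_like(patches)
    for p in range(P):
        v = patches[:, p, :]                          # (C, L)
        scores = v @ v.T / np.sqrt(L)                 # (C, C) channel affinities
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)            # softmax over channels
        out[:, p, :] = w @ v                          # channel-mixed patch
    return out

# Toy usage: 4 channels, 128 time steps, patches of 8 frequency bins.
x = np.random.default_rng(0).normal(size=(4, 128))
patches = frequency_patches(x, patch_len=8)           # (4, 8, 8)
mixed = cross_channel_attention(patches)              # same shape as patches
```

The divide-and-conquer framework would then combine such a frequency-domain branch with the linear-projection time-domain module mentioned in the abstract; that fusion step is not sketched here.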
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: pdf
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3399