Periodic and Random Sparsity for Multivariate Long-Term Time-Series Forecasting

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Time series forecasting, Transformers, Efficiency
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: To make Transformers efficient while retaining strong performance in long-term time-series forecasting, we introduce periodic and random sparsity into attention.
Abstract: For years, Transformers have achieved remarkable success in various domains such as language and image processing. Due to their capability to capture long-term relationships, they are expected to offer benefits in multivariate long-term time-series forecasting. Recent works have proposed segment-based Transformers, where each token represents a group of consecutive observations rather than a single one. However, the quadratic complexity of self-attention leads to intractable costs under high granularity and large feature sizes. In response, we propose the Efficient Segment-based Sparse Transformer (ESSformer), which incorporates two sparse attention modules tailored for segment-based Transformers. To efficiently capture temporal dependencies, ESSformer utilizes Periodic Attention (PeriA), which learns interactions between periodically distant segments. Furthermore, inter-feature dependencies are captured via Random-Partition Attention (R-PartA) and ensembling, which leads to additional cost reduction. Our empirical studies on real-world datasets show that ESSformer surpasses the forecasting capabilities of various baselines while reducing the quadratic complexity.
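The sketch below illustrates the two sparsity patterns described in the abstract: attention restricted to periodically distant segments (PeriA) and attention restricted to random feature groups (R-PartA). The tensor layout, module names, period, and group size are assumptions for illustration only; the paper's actual architecture, hyperparameters, and ensembling procedure may differ.

```python
# Minimal sketch of periodic and random-partition sparse attention (assumed layout).
import torch
import torch.nn as nn


def periodic_attention(x, attn, period):
    """Attend only among segments that are `period` steps apart.

    x: (batch, num_segments, d_model); num_segments is assumed divisible by period.
    """
    b, n, d = x.shape
    # Group segments with the same phase: indices {i, i+period, i+2*period, ...}.
    x = x.view(b, n // period, period, d).transpose(1, 2)      # (b, period, n/period, d)
    x = x.reshape(b * period, n // period, d)
    out, _ = attn(x, x, x)                                     # attention within each phase group
    out = out.view(b, period, n // period, d).transpose(1, 2)  # undo the grouping
    return out.reshape(b, n, d)


def random_partition_attention(x, attn, group_size, generator=None):
    """Attend only within randomly chosen groups of features.

    x: (batch, num_features, d_model); num_features is assumed divisible by group_size.
    """
    b, f, d = x.shape
    perm = torch.randperm(f, generator=generator)              # random feature partition
    inv = torch.argsort(perm)
    x = x[:, perm].reshape(b * (f // group_size), group_size, d)
    out, _ = attn(x, x, x)                                     # attention within each feature group
    out = out.reshape(b, f, d)
    return out[:, inv]                                         # restore original feature order


if __name__ == "__main__":
    d_model, period, group_size = 16, 4, 3
    attn = nn.MultiheadAttention(d_model, num_heads=2, batch_first=True)
    segments = torch.randn(2, 12, d_model)  # (batch, num_segments, d_model)
    features = torch.randn(2, 9, d_model)   # (batch, num_features, d_model)
    print(periodic_attention(segments, attn, period).shape)               # torch.Size([2, 12, 16])
    print(random_partition_attention(features, attn, group_size).shape)   # torch.Size([2, 9, 16])
```

Because each call attends only within small groups, cost scales with the group sizes rather than quadratically with the full numbers of segments and features; the ensembling mentioned in the abstract would correspond to averaging predictions over several independent random partitions.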
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5133