MetaTST: Essential Transformer Components for Time Series Analysis

21 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Time Series, Transformer
Abstract: This paper presents MetaTST, a versatile time series Transformer architecture that combines standard Transformer components with time-series-specific design choices, omitting the traditional token mixer in favor of non-parametric pooling operators. The study makes two primary contributions: it defines the MetaTST architecture, and it demonstrates the architecture's empirical success across forecasting, classification, imputation, and anomaly detection tasks. These results establish MetaTST as a robust and adaptable foundation for future time series Transformer designs, and they raise important questions about whether attention mechanisms are necessary for time series analysis.
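The abstract describes replacing the attention-based token mixer with non-parametric pooling while keeping the rest of the Transformer block intact. Below is a minimal sketch of what such a block might look like, in the spirit of PoolFormer-style designs adapted to time series; the class and parameter names (`PoolingBlock`, `d_model`, `d_ff`, `pool_size`) are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch of a pooling-based Transformer block, assuming a
# PoolFormer-style design adapted to time series. Names are hypothetical,
# not the authors' actual API.
import torch
import torch.nn as nn


class PoolingBlock(nn.Module):
    """Transformer block whose token mixer is non-parametric pooling."""

    def __init__(self, d_model: int, d_ff: int, pool_size: int = 3, dropout: float = 0.1):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        # Non-parametric token mixer: average pooling along the time axis.
        # stride=1 with symmetric padding preserves the sequence length.
        self.pool = nn.AvgPool1d(
            pool_size, stride=1, padding=pool_size // 2, count_include_pad=False
        )
        self.norm2 = nn.LayerNorm(d_model)
        # Standard position-wise feed-forward sublayer, kept as in a
        # conventional Transformer block.
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.GELU(),
            nn.Dropout(dropout),
            nn.Linear(d_ff, d_model),
            nn.Dropout(dropout),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, d_model)
        y = self.norm1(x)
        # AvgPool1d expects (batch, channels, time), so transpose around it.
        # Subtracting the identity, as in PoolFormer, keeps the residual
        # branch from merely re-adding the input.
        y = self.pool(y.transpose(1, 2)).transpose(1, 2) - y
        x = x + y
        x = x + self.ffn(self.norm2(x))
        return x


# Usage: mix a length-96 window of 64-dimensional embeddings.
block = PoolingBlock(d_model=64, d_ff=128)
out = block(torch.randn(8, 96, 64))  # -> torch.Size([8, 96, 64])
```

Because the mixer has no learnable parameters, all capacity sits in the feed-forward sublayers, which is what makes the abstract's question about the necessity of attention concrete: if such a block matches attention-based baselines across tasks, the attention mechanism is not doing the heavy lifting.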
Primary Area: representation learning for computer vision, audio, language, and other modalities
Submission Number: 3664