This paper presents MetaTST, a versatile time series Transformer architecture that combines standard Transformer components with time-series-specific designs, replacing the conventional token mixer (attention) with non-parametric pooling operators. The two primary contributions are the definition of the MetaTST architecture and a demonstration of its empirical strength across forecasting, classification, imputation, and anomaly detection tasks. These results establish MetaTST as a robust, adaptable foundation for future time series Transformer designs and raise the question of whether attention mechanisms are necessary for time series analysis at all.
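To make the architectural idea concrete, the following is a minimal NumPy sketch of a Transformer-style block in which the token mixer is a non-parametric average pooling over the sequence dimension instead of attention, followed by a standard channel-wise feed-forward sublayer with residual connections. This is an illustrative reconstruction, not the paper's actual implementation; the function name, window size, and weight shapes are assumptions.

```python
import numpy as np

def pooling_mixer_block(x, w1, b1, w2, b2, pool_size=3):
    """One block: non-parametric pooling as the token mixer, then a channel MLP.

    x  : (seq_len, d_model) input time series tokens
    w1 : (d_model, d_ff), b1 : (d_ff,)   -- feed-forward expansion (hypothetical shapes)
    w2 : (d_ff, d_model), b2 : (d_model,) -- feed-forward projection
    """
    seq_len, _ = x.shape
    half = pool_size // 2

    # Token mixing: local average pooling along time (no learned parameters,
    # unlike attention, which would compute query-key weighted sums here).
    pooled = np.empty_like(x)
    for t in range(seq_len):
        lo, hi = max(0, t - half), min(seq_len, t + half + 1)
        pooled[t] = x[lo:hi].mean(axis=0)
    x = x + pooled  # residual connection around the mixer

    # Standard position-wise feed-forward sublayer with ReLU and residual.
    h = np.maximum(x @ w1 + b1, 0.0)
    return x + h @ w2 + b2
```

Stacking several such blocks (with the usual normalization layers omitted here for brevity) yields an attention-free Transformer variant in the spirit the review describes.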