Enhancing Transformer Efficiency for Multivariate Time Series Classification

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Abstract: Most current multivariate time series (MTS) classification algorithms aim to improve predictive accuracy. However, for large-scale (either high-dimensional or long-sequence) time series (TS) datasets, it is crucial to design an efficient network architecture that reduces computational cost. In this work, we propose a mixing framework based on the Transformer and the Fourier transform. By pruning each module of the network separately and sequentially, we investigate the impact of each module on predictive accuracy. We conduct comprehensive experiments on 18 benchmark MTS datasets and use ablation studies to evaluate the contribution of each module. Through module-by-module pruning, our results demonstrate the trade-offs between efficiency and effectiveness, as well as between efficiency and network complexity. Finally, we evaluate, via Pareto analysis, the trade-off between network efficiency and performance.
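
The abstract does not specify the exact block design, so the following is only a minimal sketch of what a Transformer/Fourier mixing encoder for MTS classification could look like (an FNet-style FFT token-mixing step as a drop-in alternative to self-attention, with a per-layer switch so modules can be ablated or pruned individually). All class names, layer counts, and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class FourierMixingBlock(nn.Module):
    """Token mixing via a 2D FFT (real part only), in the spirit of FNet.

    Assumption for illustration -- the paper's actual mixing module is not
    described in the abstract.
    """

    def forward(self, x):
        # x: (batch, seq_len, d_model); mix across the time and feature axes.
        return torch.fft.fft2(x, dim=(-2, -1)).real


class HybridEncoderLayer(nn.Module):
    """One encoder layer that uses either self-attention or Fourier mixing,
    so each mixing module can be pruned or ablated separately."""

    def __init__(self, d_model=128, n_heads=4, d_ff=256, use_attention=True):
        super().__init__()
        self.use_attention = use_attention
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.fourier = FourierMixingBlock()
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )

    def forward(self, x):
        mixed = self.attn(x, x, x)[0] if self.use_attention else self.fourier(x)
        x = self.norm1(x + mixed)       # residual + norm around token mixing
        x = self.norm2(x + self.ff(x))  # residual + norm around feed-forward
        return x


class MTSClassifier(nn.Module):
    """Hypothetical MTS classifier: channel embedding, stacked hybrid layers
    (Fourier layers first, attention layers after), mean pooling, linear head."""

    def __init__(self, n_channels, n_classes, d_model=128,
                 n_layers=4, n_fourier_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)
        self.layers = nn.ModuleList([
            HybridEncoderLayer(d_model, use_attention=(i >= n_fourier_layers))
            for i in range(n_layers)
        ])
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):  # x: (batch, seq_len, n_channels)
        x = self.embed(x)
        for layer in self.layers:
            x = layer(x)
        return self.head(x.mean(dim=1))  # pool over time, then classify


# Example usage on a toy batch: 8 series, 256 time steps, 12 channels, 5 classes.
logits = MTSClassifier(n_channels=12, n_classes=5)(torch.randn(8, 256, 12))
print(logits.shape)  # torch.Size([8, 5])
```

The `use_attention` flag is one simple way to realize the module-by-module pruning the abstract describes: layers can be switched from attention to the cheaper FFT mixing (or removed entirely) and the accuracy/cost trade-off re-measured after each change.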