Omni-Scale CNNs: a simple and effective kernel size configuration for time series classification

29 Sept 2021, 00:31 (edited 13 Mar 2022) · ICLR 2022 Poster
  • Keywords: Time series classification
  • Abstract: The size of the receptive field has been one of the most important factors for One-Dimensional Convolutional Neural Networks (1D-CNNs) on time series classification tasks. Considerable effort has been devoted to choosing an appropriate receptive field size, as it strongly influences performance and differs significantly across datasets. In this paper, we propose an Omni-Scale block (OS-block) for 1D-CNNs, in which the kernel sizes are set by a simple and universal rule: a set of prime numbers chosen according to the length of the time series. The OS-block efficiently covers the best receptive field size across different datasets. We experimentally show that 1D-CNNs built from OS-blocks consistently achieve state-of-the-art accuracy with a smaller model size on five time series benchmarks, including both univariate and multivariate data from multiple domains. Comprehensive analysis and ablation studies shed light on how our rule finds the best receptive field size and demonstrate the consistency of the OS-block across multiple 1D-CNN structures.
  • One-sentence Summary: Many complicated scale-searching or scale-weighting methods have been proposed to extract features from time series at the proper time scales; we show that a very simple structure suffices.
  • Supplementary Material: zip
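
The abstract's kernel-size rule (a set of prime numbers chosen according to the length of the time series) can be sketched as follows. This is a minimal illustration, not the paper's exact configuration: the bound `series_length // 2` and the inclusion of sizes 1 and 2 are assumptions made here for the sketch.

```python
def prime_kernel_sizes(series_length):
    """Illustrative candidate kernel sizes for an OS-block: 1, 2, and the
    odd primes up to a bound tied to the input length.

    The bound (series_length // 2) is an assumption for this sketch; the
    paper defines its own rule for the largest kernel size.
    """
    bound = max(2, series_length // 2)
    sizes = [1, 2]
    for k in range(3, bound + 1, 2):
        # trial division by odd numbers up to sqrt(k)
        if all(k % p for p in range(3, int(k ** 0.5) + 1, 2)):
            sizes.append(k)
    return sizes

# e.g. for a series of length 20 this yields [1, 2, 3, 5, 7]
```

The appeal of such a rule is that it requires no per-dataset search: the kernel-size set is fully determined by the series length, and sums of the resulting kernel sizes across stacked layers can cover a dense range of receptive field sizes.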