Scale-teaching: Robust Multi-scale Training for Time Series Classification with Noisy Labels

Published: 21 Sept 2023, Last Modified: 03 Jan 2024, NeurIPS 2023 poster
Keywords: time series classification, deep neural networks, noisy labels
TL;DR: We propose a deep learning paradigm called Scale-teaching for time series classification with noisy labels.
Abstract: Deep Neural Networks (DNNs) have been criticized for easily overfitting noisy (incorrect) labels. To improve the robustness of DNNs, existing methods for image data regard samples with small training losses as correctly labeled (the small-loss criterion). Nevertheless, the discriminative patterns of time series are easily distorted by external noise (e.g., frequency perturbations) during the recording process, so the training losses of some time series samples fail to satisfy the small-loss criterion. Therefore, this paper proposes a deep learning paradigm called Scale-teaching to cope with noisy labels in time series classification. Specifically, we design a fine-to-coarse cross-scale fusion mechanism that learns discriminative patterns by using time series at different scales to train multiple DNNs simultaneously. Meanwhile, each network is trained in a cross-teaching manner, using complementary information from the other scales to select small-loss samples as clean ones. For the unselected large-loss samples, we introduce multi-scale embedding graph learning via label propagation, which corrects their labels by using the selected clean samples. Experiments on multiple benchmark time series datasets demonstrate the superiority of the proposed Scale-teaching paradigm over state-of-the-art methods in terms of effectiveness and robustness.
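The abstract compresses two training-time mechanisms: per-scale small-loss selection and cross-teaching between the scale-specific networks. The sketch below is a minimal PyTorch-style illustration under our own assumptions; the helper names (make_scales, select_small_loss, cross_teaching_step), the pooling-based downsampling, and the "next scale supervises this scale" pairing are ours for illustration, not the authors' implementation, and the models are assumed to handle variable input lengths (e.g., via global pooling).

```python
import torch
import torch.nn.functional as F

def make_scales(x, num_scales):
    """Build fine-to-coarse views of a batch of time series shaped
    (batch, channels, length) via average pooling with growing kernels."""
    return [x if s == 0 else F.avg_pool1d(x, kernel_size=2 ** s)
            for s in range(num_scales)]

def select_small_loss(logits, labels, keep_ratio):
    """Small-loss criterion: treat the keep_ratio fraction of samples
    with the lowest per-sample cross-entropy as cleanly labeled."""
    losses = F.cross_entropy(logits, labels, reduction="none")
    num_keep = max(1, int(keep_ratio * labels.size(0)))
    return torch.argsort(losses)[:num_keep]

def cross_teaching_step(models, optimizers, x, labels, keep_ratio):
    """One cross-teaching step: the clean set selected at one scale
    supervises the network trained at a neighboring scale."""
    scales = make_scales(x, len(models))
    logits = [m(v) for m, v in zip(models, scales)]
    # Selection uses detached logits so it does not affect gradients.
    clean = [select_small_loss(z.detach(), labels, keep_ratio) for z in logits]
    for i, (opt, z) in enumerate(zip(optimizers, logits)):
        peer = clean[(i + 1) % len(models)]  # indices chosen by the peer scale
        loss = F.cross_entropy(z[peer], labels[peer])
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Pairing each network with the clean set chosen at a different scale mirrors the co-teaching idea the abstract builds on: a network never filters its own supervision, so scale-specific selection errors are less likely to be self-reinforcing.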
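For the label-correction step, the abstract describes label propagation over a graph built on learned embeddings, seeded by the selected clean samples. The sketch below uses the standard local-and-global-consistency propagation update (Zhou et al., 2004) on a cosine-similarity kNN graph as a stand-in; the paper's multi-scale embedding graph may differ, and the parameters k, alpha, and iters are illustrative choices, not values from the paper.

```python
import numpy as np

def propagate_labels(embeddings, labels, clean_mask, num_classes,
                     k=10, alpha=0.99, iters=20):
    """Correct noisy labels by propagating the labels of the selected
    clean samples over a kNN graph built on the learned embeddings."""
    n = len(embeddings)
    # Cosine-similarity affinity, sparsified to the k nearest neighbors.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T
    np.fill_diagonal(sim, 0.0)
    topk = np.argsort(sim, axis=1)[:, -k:]
    W = np.zeros_like(sim)
    rows = np.repeat(np.arange(n), k)
    W[rows, topk.ravel()] = sim[rows, topk.ravel()]
    W = np.maximum(W, W.T)                            # symmetrize the graph
    d = W.sum(axis=1)
    S = W / (np.sqrt(np.outer(d, d)) + 1e-12)         # D^{-1/2} W D^{-1/2}
    # One-hot seed labels from the clean (small-loss) samples only.
    Y = np.zeros((n, num_classes))
    Y[clean_mask, labels[clean_mask]] = 1.0
    F_mat = Y.copy()
    for _ in range(iters):                            # F <- a*S@F + (1-a)*Y
        F_mat = alpha * (S @ F_mat) + (1 - alpha) * Y
    return F_mat.argmax(axis=1)                       # corrected hard labels
```

Only the clean samples contribute seed mass, so the large-loss samples inherit labels from nearby confidently clean neighbors in embedding space rather than from their own possibly corrupted annotations.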
Supplementary Material: pdf
Submission Number: 2155