Training Noise Robust Deep Neural Networks with Self-supervised Learning

Published: 01 Jan 2023 · Last Modified: 13 Nov 2024 · ADMA (4) 2023 · CC BY-SA 4.0
Abstract: Training accurate deep neural networks (DNNs) on datasets with label noise is challenging in practical applications. The sample selection paradigm is a popular strategy that selects potentially clean data from noisy data for noise-robust training. In this study, we first analyze sample selection models and find that the key aspects of this paradigm are the scale and purity of the selected samples; however, both are restricted by the noise strength of the training set and the learning capacity of the models. We therefore propose a simple yet effective method called CoPL, which cross-trains two noise-robust DNNs simultaneously based on the small-loss criterion while learning accurate pseudo-labels. By exploiting pseudo-labels, CoPL reduces the effective noise strength and further improves the learning capacity and robustness of the models. Additionally, we discuss CoPL from the perspective of label smoothing to provide a theoretical guarantee for its performance. Extensive experimental results on both simulated (MNIST, CIFAR-10, and CIFAR-100) and real-world (Clothing1M) datasets demonstrate that CoPL outperforms other state-of-the-art methods and achieves more noise-robust learning.
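The abstract does not give implementation details, but the two ingredients it names (cross-training two networks under the small-loss criterion, plus pseudo-labeling) can be illustrated with a minimal PyTorch sketch. The function name `copl_step`, the `keep_ratio` and `pseudo_threshold` hyperparameters, and the confidence-thresholded pseudo-labeling rule below are assumptions for illustration, not the paper's actual method:

```python
import torch
import torch.nn.functional as F


def copl_step(net_a, net_b, opt_a, opt_b, x, y,
              keep_ratio=0.7, pseudo_threshold=0.9):
    """One cross-training step in the spirit of CoPL (illustrative only).

    Each network ranks the batch by per-sample loss; its peer is then
    trained on that small-loss ("likely clean") subset, while confident
    peer predictions replace the given labels on the remaining samples.
    keep_ratio and pseudo_threshold are hypothetical hyperparameters.
    """
    logits_a, logits_b = net_a(x), net_b(x)
    loss_a = F.cross_entropy(logits_a, y, reduction="none")
    loss_b = F.cross_entropy(logits_b, y, reduction="none")

    # Small-loss criterion: each network nominates samples for its peer.
    k = max(1, int(keep_ratio * x.size(0)))
    clean_for_b = torch.topk(loss_a, k, largest=False).indices
    clean_for_a = torch.topk(loss_b, k, largest=False).indices

    def peer_loss(logits_self, logits_peer, clean_idx):
        # Supervised loss on the peer-selected small-loss subset.
        total = F.cross_entropy(logits_self[clean_idx], y[clean_idx])
        # Pseudo-label the remaining samples with confident peer predictions.
        noisy = torch.ones(x.size(0), dtype=torch.bool, device=x.device)
        noisy[clean_idx] = False
        probs = F.softmax(logits_peer[noisy].detach(), dim=1)
        conf, pseudo = probs.max(dim=1)
        keep = conf > pseudo_threshold
        if keep.any():
            total = total + F.cross_entropy(logits_self[noisy][keep],
                                            pseudo[keep])
        return total

    opt_a.zero_grad()
    peer_loss(logits_a, logits_b, clean_for_a).backward()
    opt_a.step()

    opt_b.zero_grad()
    peer_loss(logits_b, logits_a, clean_for_b).backward()
    opt_b.step()
```

Note that the peer's logits are detached before pseudo-labeling, so gradients never flow between the two networks; in this co-training style, each network only supplies sample selections and targets for the other, which is what lets the pair resist fitting the label noise.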