Early-Late Dropout for DivideMix: Learning with Noisy Labels in Deep Neural Networks

Published: 01 Jan 2024, Last Modified: 28 Sept 2024 · IJCNN 2024 · CC BY-SA 4.0
Abstract: Deep neural networks require large amounts of labeled data, and much research has been devoted to reducing the cost of labeling when training deep networks. Two prominent directions are learning with noisy labels and semi-supervised learning; DivideMix is a classical model for noisy-label learning that achieves superior results. Recent studies have highlighted that employing dropout during the early and late stages of training can address underfitting and overfitting, respectively. This strategy significantly enhances the model's resilience to noisy labels and its ability to learn from clean labels. We therefore combine early/late dropout with the DivideMix model to promote better fitting of the data and to avoid overfitting to noise early in training. We conducted comprehensive experiments on the CIFAR-10/100 datasets under various noise levels and on the real-world, noisily labeled Clothing1M dataset. The results demonstrate the efficacy and robustness of our proposed scheme, particularly in practical scenarios with diverse noise characteristics.
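To make the early/late dropout idea concrete, here is a minimal PyTorch sketch of one way such a schedule could be wired into a training loop like DivideMix's. The epoch cutoffs (EARLY_END, LATE_START), dropout rates, and helper names are illustrative assumptions, not values or APIs from the paper.

```python
import torch.nn as nn

# Illustrative cutoffs (assumptions, not the paper's hyperparameters):
EARLY_END = 20    # dropout active for epochs [0, EARLY_END) to curb underfitting
LATE_START = 150  # dropout active again from LATE_START onward to curb overfitting


def set_dropout(model: nn.Module, p: float) -> None:
    """Set the drop probability of every nn.Dropout module in `model`."""
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.p = p


def apply_early_late_dropout(model: nn.Module, epoch: int,
                             p_early: float = 0.1, p_late: float = 0.1) -> None:
    if epoch < EARLY_END:
        # Early phase: dropout on, to help the model fit the data better.
        set_dropout(model, p_early)
    elif epoch >= LATE_START:
        # Late phase: dropout on again, to discourage memorizing noisy labels.
        set_dropout(model, p_late)
    else:
        # Middle phase: standard training without dropout.
        set_dropout(model, 0.0)
```

In a DivideMix-style loop, one would call `apply_early_late_dropout(model, epoch)` at the start of each epoch, before the warm-up or co-training steps, so the dropout state follows the training phase.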