Can Label-Noise Transition Matrix Help to Improve Sample Selection and Label Correction?

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Abstract: Existing methods for learning with noisy labels can be broadly divided into two categories: (1) sample selection and label correction based on the memorization effect of neural networks; (2) loss correction with the transition matrix. So far, the two categories of methods have been studied independently because they are designed according to different philosophies: the memorization effect is a property of neural networks that is independent of label noise, while the transition matrix is exploited to model the distribution of label noise. In this paper, we take a first step towards unifying these two paradigms by showing that modelling the distribution of label noise with the transition matrix can also help sample selection and label correction, which leads to better robustness against different types of noise. More specifically, we first train a network with the loss corrected by the transition matrix, and then use the confidence of the clean class posterior estimated by that network to select and re-label instances. Our proposed method demonstrates strong robustness on multiple benchmark datasets under various types of noise.
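To make the two stages described in the abstract concrete, below is a minimal PyTorch sketch: a forward-corrected loss that maps the estimated clean class posterior through the transition matrix before computing the negative log-likelihood on noisy labels, followed by a confidence-based selection and relabeling step. This is an illustrative reading of the abstract, not the authors' implementation; the transition matrix `T`, the confidence `threshold`, and the function names are assumptions.

```python
import torch
import torch.nn.functional as F

def forward_corrected_loss(logits, noisy_labels, T):
    """Forward loss correction.

    T[i, j] is assumed to be p(noisy label = j | clean label = i), so the
    noisy-label posterior is p(clean | x) @ T. The NLL is taken against the
    observed noisy labels.
    """
    clean_posterior = F.softmax(logits, dim=1)        # estimated p(clean y | x)
    noisy_posterior = clean_posterior @ T             # estimated p(noisy y | x)
    return F.nll_loss(torch.log(noisy_posterior + 1e-12), noisy_labels)

def select_and_relabel(logits, noisy_labels, threshold=0.9):
    """Select and re-label instances using the estimated clean posterior.

    Instances whose given label matches the model's clean prediction are kept
    as-is; instances where the model confidently disagrees are re-labeled with
    the predicted clean class. The rest are left unselected.
    """
    clean_posterior = F.softmax(logits, dim=1)
    confidence, predicted = clean_posterior.max(dim=1)
    agrees = predicted == noisy_labels
    corrected = (~agrees) & (confidence >= threshold)
    new_labels = torch.where(corrected, predicted, noisy_labels)
    selected = agrees | corrected
    return new_labels, selected
```

In use, one would train with `forward_corrected_loss` in a first stage and then apply `select_and_relabel` to the training set before a second, standard training stage on the selected and re-labeled instances.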
One-sentence Summary: We propose a calibrated sample selection and label correction method that exploits the noise transition matrix.
Supplementary Material: zip
