Noise against noise: stochastic label noise helps combat inherent label noise

Published: 12 Jan 2021, Last Modified: 05 May 2023
ICLR 2021 Spotlight
Readers: Everyone
Keywords: Noisy Labels, Robust Learning, SGD noise, Regularization
Abstract: The noise in stochastic gradient descent (SGD) provides a crucial implicit regularization effect, previously studied in optimization by analyzing the dynamics of parameter updates. In this paper, we are interested in learning with noisy labels, where we have a collection of samples with potential mislabeling. We show that a rarely discussed form of SGD noise, induced by stochastic label noise (SLN), mitigates the effects of inherent label noise, whereas the common SGD noise applied directly to model parameters does not. We formalize the differences and connections among these SGD noise variants, showing that SLN induces SGD noise that depends on the sharpness of the output landscape and the confidence of the output probability, which may help escape sharp minima and prevent overconfidence. SLN not only improves generalization in its simplest form but also boosts popular robust training methods, including sample selection and label correction. In particular, we present an enhanced algorithm that applies SLN to label correction. Our code is released.
One-sentence Summary: SGD noise induced by stochastic label noise helps escape sharp minima and prevents overconfidence, and hence can mitigate the effects of inherent label noise and improve generalization.
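
The abstract describes SLN only at a high level, so the following is a minimal sketch of how such a training step could look, assuming SLN is realized by perturbing the one-hot targets with fresh zero-mean Gaussian noise at every SGD iteration; the noise scale `sigma`, the function name `sln_step`, and the soft cross-entropy formulation are illustrative assumptions, not the paper's exact implementation.

```python
# Sketch of one SGD step with stochastic label noise (SLN).
# Assumption: SLN = add fresh zero-mean Gaussian noise to one-hot labels
# each iteration; sigma and the soft cross-entropy loss are illustrative.
import torch
import torch.nn.functional as F

def sln_step(model, optimizer, x, y, num_classes, sigma=0.5):
    """Run one optimization step on batch (x, y) with noisy targets."""
    one_hot = F.one_hot(y, num_classes).float()
    # Noise is redrawn every step, so the induced SGD noise varies across
    # iterations instead of acting as a fixed label corruption.
    noisy_targets = one_hot + sigma * torch.randn_like(one_hot)
    logits = model(x)
    # Soft cross-entropy against the perturbed targets.
    loss = -(noisy_targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, the label noise enters the loss gradient through the targets, so its effect scales with the model's output confidence, which is consistent with the abstract's claim that the induced SGD noise depends on the output landscape rather than being injected directly into the parameters.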
Supplementary Material: zip
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Data: [Clothing1M](https://paperswithcode.com/dataset/clothing1m)