Let the noise flow away: combating noisy labels using normalizing flows

Published: 01 Jan 2025, Last Modified: 25 Jan 2025, Machine Learning (2025), CC BY-SA 4.0
Abstract: We introduce NoiseFlow, a generative network that addresses the problem of noisy labels in classification by modeling the entire label distribution conditioned on the input image. Unlike previous methods, which assign each input to a single class, NoiseFlow generates diverse labels by conditioning on the image and a noise vector drawn from a standard normal distribution. This improves generalization, since the model does not require extensive hyperparameter tuning to fit the unknown label noise. To model the label distribution, we use conditional normalizing flows, which avoid mode collapse and ensure that the correct label is present in the modeled distribution, enabling accurate classification. Moreover, NoiseFlow can be combined with other training strategies, such as mixup interpolation and contrastive learning, to achieve even better performance. We compare NoiseFlow with baseline methods on several synthetic and real-world datasets, and the experimental results demonstrate its effectiveness.
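
As a rough illustration of the idea summarized in the abstract, the sketch below shows how a conditional normalizing flow could map standard-normal noise, conditioned on image features, to candidate labels, so that re-sampling the noise yields a distribution over labels for the same image. This is a minimal, hypothetical example, not the paper's actual architecture or code: the names (`ConditionalCoupling`, `NoiseFlowSketch`, `sample_labels`), layer sizes, and the use of affine coupling layers are assumptions made for illustration.

```python
# Hypothetical sketch of a conditional normalizing flow for label sampling.
# Not the authors' implementation; all names and sizes are illustrative.
import torch
import torch.nn as nn


class ConditionalCoupling(nn.Module):
    """One affine coupling layer whose scale/shift depend on image features."""

    def __init__(self, dim, cond_dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        # Predict scale and shift for the second half from the first half + condition.
        self.net = nn.Sequential(
            nn.Linear(self.half + cond_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, z, cond):
        z1, z2 = z[:, : self.half], z[:, self.half:]
        scale, shift = self.net(torch.cat([z1, cond], dim=1)).chunk(2, dim=1)
        z2 = z2 * torch.exp(torch.tanh(scale)) + shift  # invertible affine transform
        return torch.cat([z1, z2], dim=1)


class NoiseFlowSketch(nn.Module):
    """Stack of conditional coupling layers: noise + image features -> label vector."""

    def __init__(self, num_classes, cond_dim, num_layers=4):
        super().__init__()
        self.num_classes = num_classes
        self.layers = nn.ModuleList(
            ConditionalCoupling(num_classes, cond_dim) for _ in range(num_layers)
        )

    def sample_labels(self, image_features, num_samples=10):
        """Draw several candidate labels per image by re-sampling the noise."""
        batch = image_features.size(0)
        samples = []
        for _ in range(num_samples):
            z = torch.randn(batch, self.num_classes)  # standard-normal noise
            for layer in self.layers:
                z = layer(z, image_features)
                # Flip dimensions so both halves are transformed across layers.
                z = torch.flip(z, dims=[1])
            samples.append(z.argmax(dim=1))
        return torch.stack(samples, dim=1)  # shape: (batch, num_samples)


if __name__ == "__main__":
    feats = torch.randn(8, 32)  # stand-in for backbone image features
    flow = NoiseFlowSketch(num_classes=10, cond_dim=32)
    print(flow.sample_labels(feats).shape)  # torch.Size([8, 10])
```

In this reading, training would fit the flow by maximum likelihood on the observed (possibly noisy) labels, while the abstract's claim about avoiding mode collapse reflects a general property of normalizing flows: the exact likelihood objective keeps probability mass on every observed label rather than collapsing to a single mode.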