Equal Loss: A Simple Loss Function for Noise Robust Learning

ICASSP 2022 (modified: 17 Nov 2022)
Abstract: Training accurate deep neural networks (DNNs) in the presence of noisy labels is an important task. Although a number of approaches have been proposed for learning with noisy labels, many open issues remain. In this paper, we show that DNN training with the Cross Entropy loss is not robust to label noise and exhibits an imbalance between the gradients of clean and noisy samples. We propose a new loss function, Equal Loss (EL), which improves DNN training through a relaxed target probability and a balanced gradient density. Both theoretical analysis and experiments on a range of benchmark and real-world datasets show that EL outperforms state-of-the-art methods.
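The abstract's starting observation can be made concrete with a minimal sketch. This is not the paper's Equal Loss (which the abstract does not specify); it only illustrates the cross-entropy gradient imbalance the authors point to: for softmax plus cross-entropy, the gradient with respect to the logit of the labeled class is p_y - 1, so poorly-fit samples (often the noisy-labeled ones) receive much larger gradients than well-fit clean samples. The logit values below are arbitrary illustrative choices.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def ce_grad_wrt_logits(logits, label):
    """Gradient of cross-entropy w.r.t. logits: softmax(logits) - one_hot(label)."""
    g = softmax(logits)
    g[label] -= 1.0
    return g

# A confidently-fit "clean" sample: the model already assigns high
# probability to the labeled class, so the gradient is small.
clean_grad = ce_grad_wrt_logits(np.array([4.0, 0.0, 0.0]), label=0)

# A poorly-fit (possibly noisy-labeled) sample: the model favors another
# class, so this single sample dominates the gradient.
noisy_grad = ce_grad_wrt_logits(np.array([0.0, 4.0, 0.0]), label=0)

print(np.linalg.norm(clean_grad))  # small
print(np.linalg.norm(noisy_grad))  # roughly an order of magnitude larger
```

Under this view, a handful of mislabeled samples can contribute a disproportionate share of the total gradient, which is the imbalance the proposed Equal Loss is designed to correct.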