Abstract: Supervised deep learning has achieved tremendous success in many computer vision tasks, but it is prone to overfitting noisy labels. To mitigate the undesirable influence of noisy labels, robust loss functions offer a feasible approach to achieving noise-tolerant learning. In this work, we systematically study the problem of noise-tolerant learning for both classification and regression. Specifically, we propose a new class of loss functions, namely \textit{asymmetric loss functions} (ALF), which are tailored to satisfy the Bayes-optimal condition and are thus robust to noisy labels. For classification, we investigate general theoretical properties of ALF under categorical label noise, and introduce the asymmetry ratio to measure the asymmetry of a loss function. We extend several commonly used loss functions and establish the necessary and sufficient conditions that make them asymmetric and thus noise-robust. For regression, we extend the concept of noise-tolerant learning to image restoration with continuous noisy labels. We theoretically prove that the $\ell_p$ loss ($p>0$) is noise-tolerant for targets corrupted by Gaussian noise. For targets with general noise, we introduce two losses as surrogates of the $\ell_0$ loss, which seeks the mode as long as clean pixels remain dominant. Experimental results demonstrate that ALF achieves better or comparable performance compared with state-of-the-art methods.
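As an informal illustration of the regression claim above (the $\ell_p$ loss with $p>0$ being noise-tolerant for Gaussian-corrupted targets), below is a minimal PyTorch-style sketch of an $\ell_p$ loss with a tunable exponent. The function name `lp_loss`, the default $p=0.5$, and the stabilizing constant are assumptions made for illustration only, not the paper's implementation.

```python
import torch

def lp_loss(pred: torch.Tensor, target: torch.Tensor, p: float = 0.5) -> torch.Tensor:
    """Illustrative l_p loss (p > 0) for regression with noisy targets.

    Smaller p down-weights large residuals, which is the intuition behind
    robustness to outlying (noisy) target pixels.
    """
    eps = 1e-8  # small constant to keep gradients finite at zero residual
    return torch.mean((torch.abs(pred - target) + eps) ** p)

# Usage sketch: denoising-style regression against noisy targets
pred = torch.randn(4, 3, 32, 32, requires_grad=True)
noisy_target = torch.randn(4, 3, 32, 32)
loss = lp_loss(pred, noisy_target, p=0.5)
loss.backward()
```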