The Resistance to Label Noise in $K$-NN and DNN Depends on its Concentration

10 Jun 2020 (modified: 05 May 2023) — Submitted to ICML Artemiss 2020
TL;DR: This work draws a connection between KNN and DNN and uses it to provide an explanation for DNN robustness to label noise in various noise settings.
Keywords: Label noise, KNN, robustness
Abstract: We investigate the classification performance of $K$-nearest neighbors ($K$-NN) and deep neural networks (DNNs) in the presence of label noise. We first show empirically that a DNN’s prediction for a given test example depends on the labels of the training examples in its local neighborhood. This motivates us to derive a realizable analytic expression that approximates the multi-class $K$-NN classification error in the presence of label noise, which is of independent importance. We then suggest that this $K$-NN expression may serve as a first-order approximation of the DNN error. Finally, we demonstrate empirically that the developed expression closely tracks the observed performance of DNNs. Our result may explain an important factor in DNN robustness to label noise: the less concentrated the noise, the greater the network's resistance to it.
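The abstract's central claim — that spread-out (uniform) label noise is far less harmful to a $K$-NN classifier than the same amount of noise concentrated in one region — can be illustrated with a small toy experiment. The sketch below is not the paper's method or data; it is a minimal stdlib-only simulation under assumed settings (two Gaussian classes, $K=15$, a 20% noise budget) showing that a majority vote over $K$ neighbors absorbs uniform flips but fails where flips are concentrated.

```python
import random

random.seed(0)

def make_data(n):
    # Toy data (assumption, not the paper's benchmark): two classes drawn from
    # 2-D Gaussians centered at (-1, -1) for class 0 and (+1, +1) for class 1.
    data = []
    for _ in range(n):
        y = random.randint(0, 1)
        c = 1.0 if y == 1 else -1.0
        x = (random.gauss(c, 0.5), random.gauss(c, 0.5))
        data.append((x, y))
    return data

def knn_predict(train, x, k=15):
    # Plain K-NN: majority vote over the k nearest training labels.
    nearest = sorted(train,
                     key=lambda p: (p[0][0] - x[0]) ** 2 + (p[0][1] - x[1]) ** 2)[:k]
    votes = sum(y for _, y in nearest)
    return 1 if 2 * votes > k else 0

def accuracy(train, test, k=15):
    return sum(knn_predict(train, x, k) == y for x, y in test) / len(test)

train = make_data(500)
test = make_data(200)

# Uniform (non-concentrated) noise: flip each training label
# independently with probability 0.2 (about 100 flips in expectation).
uniform = [(x, 1 - y) if random.random() < 0.2 else (x, y) for x, y in train]

# Concentrated noise: flip exactly the 100 training labels nearest to (1, 1),
# corrupting one local neighborhood with the same overall noise budget.
by_dist = sorted(range(len(train)),
                 key=lambda i: (train[i][0][0] - 1.0) ** 2 + (train[i][0][1] - 1.0) ** 2)
flipped = set(by_dist[:100])
concentrated = [(x, 1 - y) if i in flipped else (x, y)
                for i, (x, y) in enumerate(train)]

acc_clean = accuracy(train, test)
acc_uniform = accuracy(uniform, test)
acc_conc = accuracy(concentrated, test)
print(f"clean: {acc_clean:.2f}  uniform noise: {acc_uniform:.2f}  "
      f"concentrated noise: {acc_conc:.2f}")
```

With uniform noise, each test point's neighborhood still holds a correct majority, so accuracy stays close to the clean baseline; with the same number of flips concentrated near one class center, every test point in that region inherits a corrupted majority and accuracy drops sharply — the qualitative behavior the abstract attributes to both $K$-NN and DNNs.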