Keywords: deep k-NN, label smoothing, out-of-distribution detection, robustness
Abstract: Detecting out-of-distribution (OOD) examples is critical in many applications. We propose an unsupervised method to detect OOD samples using a $k$-NN density estimate with respect to a classification model's intermediate activations on in-distribution samples. We leverage a recent insight about label smoothing, which we call the {\it Label Smoothed Embedding Hypothesis}, and show that one of its implications is that the $k$-NN density estimator performs better as an OOD detection method, both theoretically and empirically, when the model is trained with label smoothing. Finally, we show that our proposal outperforms many OOD baselines, and we provide new finite-sample, high-probability statistical guarantees on the ability of $k$-NN density estimation to detect OOD examples.
One-sentence Summary: We propose an unsupervised method to detect out-of-distribution samples using a $k$-NN density estimate with respect to a classification model's intermediate activations on in-distribution samples.
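For concreteness, a minimal sketch of the kind of detector the abstract describes, assuming intermediate activations have already been extracted from a classifier trained on in-distribution data (ideally with label smoothing). The function names, the choice k=50, and the use of penultimate-layer features are illustrative assumptions, not the authors' released implementation:

import numpy as np
from sklearn.neighbors import NearestNeighbors

def fit_knn_index(in_dist_features, k=50):
    # Fit a k-NN index on intermediate activations (e.g. penultimate-layer
    # features) computed on in-distribution samples.
    return NearestNeighbors(n_neighbors=k).fit(in_dist_features)

def knn_ood_score(index, test_features):
    # OOD score: distance to the k-th nearest in-distribution activation.
    # A larger distance corresponds to a lower k-NN density estimate,
    # i.e. the sample is more likely out-of-distribution.
    dists, _ = index.kneighbors(test_features)  # shape (M, k), sorted ascending
    return dists[:, -1]

# Stand-in usage with random features; in practice these would be activations
# from the trained classifier, and the decision threshold would be calibrated
# on scores of held-out in-distribution data.
index = fit_knn_index(np.random.randn(5000, 512), k=50)
scores = knn_ood_score(index, np.random.randn(10, 512))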