Nonparametric Approach to Uncertainty Quantification for Deterministic Neural Networks

09 Oct 2021, 14:49 (edited 30 Nov 2021) · NeurIPS 2021 Workshop DistShift Poster
  • Keywords: Out-of-distribution detection, uncertainty quantification, epistemic uncertainty, aleatoric uncertainty, non-parametric models, Nadaraya-Watson estimator, misclassification detection
  • TL;DR: A new scalable non-parametric uncertainty estimation method applicable to any neural network.
  • Abstract: This paper proposes a fast and scalable method for uncertainty quantification of machine learning models' predictions. First, we show a principled way to measure the uncertainty of a classifier's predictions based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution. Importantly, the approach explicitly disentangles aleatoric and epistemic uncertainties. The resulting method works directly in the feature space; however, it can be applied to any neural network by considering the embedding of the data induced by the network. We demonstrate the strong performance of the method on uncertainty estimation tasks across a variety of real-world image datasets, such as MNIST, SVHN, CIFAR-100, and several versions of ImageNet.
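To make the core idea concrete, here is a minimal sketch of a Nadaraya-Watson estimate of the conditional label distribution p(y | x) in an embedding space. This is an illustrative implementation of the general estimator, not the authors' exact method; the Gaussian kernel, the bandwidth parameter, and the fallback to a uniform distribution for queries far from all data are assumptions made for the example.

```python
import numpy as np

def nw_label_distribution(query, embeddings, labels, n_classes, bandwidth=1.0):
    """Nadaraya-Watson estimate of the conditional label distribution p(y | x).

    query:      (d,) embedding of the test point
    embeddings: (n, d) embeddings of the training points
    labels:     (n,) integer class labels in [0, n_classes)
    """
    # Gaussian (RBF) kernel weights between the query and each training point
    sq_dists = np.sum((embeddings - query) ** 2, axis=1)
    weights = np.exp(-sq_dists / (2.0 * bandwidth ** 2))

    # Kernel-weighted vote per class, normalised to a probability vector
    scores = np.zeros(n_classes)
    np.add.at(scores, labels, weights)  # accumulate weights by class label
    total = scores.sum()
    if total == 0.0:
        # Query is far from all training data: fall back to a uniform
        # distribution (an assumed convention for this sketch)
        return np.full(n_classes, 1.0 / n_classes)
    return scores / total
```

Under this reading, the entropy of the returned distribution reflects aleatoric uncertainty (label noise near the query), while a small total kernel mass `total` signals that the query lies far from the training data, i.e. epistemic uncertainty; how the paper formally disentangles the two is given in the paper itself.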