Keywords: entropy, uncertainty, out-of-distribution detection, distribution shift, density, distance
TL;DR: A computationally efficient method to perform distribution shift detection across computer vision tasks using latent representation entropy density.
Abstract: Distribution shift detection is paramount in safety-critical tasks that rely on Deep Neural Networks (DNNs). The detection task entails deriving a confidence score to assert whether a new input sample aligns with the training data distribution of the DNN model. While DNN predictive uncertainty offers an intuitive confidence measure, uncertainty-based distribution shift detection with simple sample-based techniques has been relatively overlooked in recent years due to its computational overhead and lower performance than plain post-hoc methods. This paper proposes using simple sample-based techniques to estimate uncertainty and employing the entropy density of intermediate representations to detect distribution shifts. We demonstrate the effectiveness of our method on standard benchmark datasets for out-of-distribution detection and across common perception tasks with convolutional neural network architectures. Our scope extends beyond classification, encompassing image-level distribution shift detection for object detection and semantic segmentation tasks. Our results show that our method performs comparably to existing state-of-the-art methods while being computationally faster and lighter than other Bayesian approaches, affirming its practical utility.
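The abstract's core idea can be illustrated with a minimal, self-contained sketch (not the authors' LaREx implementation): compute an entropy statistic over each sample's intermediate representation, fit a density to the entropies observed on in-distribution data, and flag new samples whose entropy falls in a low-density region. All names, the synthetic latents, and the simple Gaussian density below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def latent_entropy(z, edges):
    """Shannon entropy of the histogram of one latent feature vector."""
    hist, _ = np.histogram(z, bins=edges)
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Shared bin edges so entropies are comparable across samples.
edges = np.linspace(-10.0, 10.0, 33)

# Stand-ins for intermediate DNN representations (synthetic, for illustration):
# in-distribution latents are tight, shifted latents are wider and offset.
id_latents = rng.normal(0.0, 1.0, size=(500, 128))
shift_latents = rng.normal(2.0, 3.0, size=(200, 128))

id_H = np.array([latent_entropy(z, edges) for z in id_latents])
shift_H = np.array([latent_entropy(z, edges) for z in shift_latents])

# Fit a simple Gaussian density to in-distribution entropies; a low
# log-density for a new sample's entropy signals a distribution shift.
mu, sigma = id_H.mean(), id_H.std()

def log_density(h):
    return -0.5 * ((h - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

id_scores = log_density(id_H)
shift_scores = log_density(shift_H)
print(np.median(id_scores) > np.median(shift_scores))  # shifted samples score lower
```

The only per-sample cost at test time is one histogram and one log-density evaluation, which is consistent with the abstract's emphasis on being lighter than sampling-heavy Bayesian alternatives; the paper's actual density estimator and entropy computation may differ.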
Supplementary Material: zip
List Of Authors: Arnez, Fabio and Montoya Vasquez, Daniel Alfonso and Radermacher, Ansgar and Terrier, Francois
Latex Source Code: zip
Signed License Agreement: pdf
Code Url: https://github.com/CEA-LIST/LaREx
Submission Number: 417