Uncertainty for deep image classifiers on out of distribution data.

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Blind Submission · Readers: Everyone
Keywords: uncertainty, confidence, out of distribution, outlier exposure, classification
Abstract: In addition to achieving high accuracy, in many applications it is important to estimate the probability that a model prediction is correct. Predictive uncertainty is particularly important on out-of-distribution (OOD) data, where accuracy degrades. However, models are typically overconfident, and model calibration on OOD data remains a challenge. In this paper, we propose a simple post hoc calibration method that significantly improves on benchmark results [Ovadia et al., 2019] across a wide range of corrupted data. Our method uses outlier exposure to properly calibrate the model probabilities.
One-sentence Summary: Improving on benchmark estimates of model uncertainty on OOD data using outlier exposure.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=nmV6t8HIN
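
The abstract describes the approach only at a high level: post hoc calibration of a trained classifier, driven by outlier exposure. As a rough, hypothetical illustration of that general recipe (not the submission's actual procedure), the sketch below fits a single temperature on frozen validation logits while also pushing predictions on an auxiliary outlier-exposure set toward the uniform distribution. The objective, the `ood_weight` parameter, and the helper `fit_temperature` are assumptions made for illustration only.

```python
# Hypothetical sketch of outlier-exposure-aware post hoc calibration.
# This is NOT the submission's exact method; it only illustrates the idea of
# combining temperature scaling with an outlier-exposure penalty.
import torch
import torch.nn.functional as F

def fit_temperature(logits_in, labels_in, logits_out, ood_weight=0.5, steps=200):
    """Fit a single temperature T on frozen logits.

    logits_in, labels_in: validation logits and labels (in-distribution).
    logits_out: logits on an auxiliary outlier-exposure set (no labels).
    ood_weight: weight on the uniform-prediction penalty for outliers (assumed).
    """
    log_t = torch.zeros(1, requires_grad=True)   # optimize log T so T stays positive
    opt = torch.optim.Adam([log_t], lr=0.05)
    num_classes = logits_in.shape[1]
    uniform = torch.full((logits_out.shape[0], num_classes), 1.0 / num_classes)
    for _ in range(steps):
        opt.zero_grad()
        t = log_t.exp()
        # standard calibration term: NLL on in-distribution validation data
        nll = F.cross_entropy(logits_in / t, labels_in)
        # outlier-exposure term: KL(uniform || softmax(logits_out / T))
        log_probs_out = F.log_softmax(logits_out / t, dim=1)
        oe = F.kl_div(log_probs_out, uniform, reduction="batchmean")
        (nll + ood_weight * oe).backward()
        opt.step()
    return log_t.exp().item()
```

In this sketch the network weights stay fixed and only the scalar temperature is learned, which is what makes the calibration post hoc; the outlier term simply discourages confident predictions on data the model should treat as unfamiliar.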