Evaluating and Calibrating Uncertainty Prediction in Regression Tasks

Anonymous

Sep 25, 2019 Blind Submission
  • Keywords: Uncertainty Estimation, Regression, Deep learning
  • TL;DR: We propose a new definition for calibrated uncertainty prediction in regression tasks and a method for uncertainty calibration
  • Abstract: Predicting not only the target but also an accurate measure of uncertainty is important for many applications, and in particular safety-critical ones. In this work we study the calibration of uncertainty prediction for regression tasks, which often arise in real-world systems. We show that the existing definition for calibration of regression uncertainty [Kuleshov et al. 2018] has severe limitations in distinguishing informative from non-informative uncertainty predictions. We propose a new definition that avoids this caveat, together with an evaluation method using a simple histogram-based approach inspired by the reliability diagrams used in classification tasks. Our method clusters examples with similar uncertainty predictions and compares the predicted uncertainty with the empirical uncertainty on these examples. We also propose a simple scaling-based calibration that performs well in our experimental tests. We show results on both a synthetic, controlled problem and the object-detection bounding-box regression task, using the COCO and KITTI datasets.
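The evaluation and calibration ideas sketched in the abstract can be illustrated with a minimal example. This is a hedged sketch under assumptions, not the paper's exact procedure: it bins examples by quantiles of the predicted standard deviation, compares the mean predicted variance to the empirical mean squared error in each bin (the histogram-based evaluation), and fits a single scalar `s` so that the rescaled variances `(s * sigma)**2` match the overall empirical error (one simple realization of a scaling-based calibration). The function names and binning scheme are illustrative choices.

```python
import numpy as np

def reliability_bins(y_true, y_pred, sigma_pred, n_bins=10):
    """Histogram-based evaluation sketch: group examples into equal-count
    bins of similar predicted uncertainty, then compare the mean predicted
    variance with the empirical MSE in each bin. (Quantile binning is an
    assumption, not necessarily the paper's exact recipe.)"""
    order = np.argsort(sigma_pred)
    bins = np.array_split(order, n_bins)       # bins of similar sigma
    predicted, empirical = [], []
    for idx in bins:
        predicted.append(np.mean(sigma_pred[idx] ** 2))              # mean predicted variance
        empirical.append(np.mean((y_true[idx] - y_pred[idx]) ** 2))  # empirical squared error
    return np.array(predicted), np.array(empirical)

def scale_calibrate(y_true, y_pred, sigma_pred):
    """Scaling-based calibration sketch: fit one scalar s so that the
    normalized squared errors (y - y_hat)^2 / (s * sigma)^2 average to 1."""
    s2 = np.mean((y_true - y_pred) ** 2 / sigma_pred ** 2)
    return np.sqrt(s2)
```

On synthetic data where the predicted sigmas are correct, the fitted scale should be close to 1 and the per-bin predicted and empirical variances should roughly agree; a calibrated model can then report `s * sigma_pred` as its uncertainty.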
