Network calibration by weight scaling

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Keywords: network calibration, temperature scaling, Expected Calibration Error (ECE)
Abstract: Calibrating neural networks is crucial in applications where decision making depends on the predicted probabilities. Modern neural networks are not well calibrated: they tend to produce confidence estimates that overestimate the expected accuracy. This yields misleading reliability estimates that can corrupt downstream decision policies. We define a weight scaling calibration method that computes a convex combination of the network's output class distribution and the uniform distribution. The weight controls the confidence of the calibrated prediction. Since the goal of calibration is to make the predicted confidence more accurate, the most suitable weight is found as a function of the given confidence. We derive an optimization method based on a closed-form solution for the optimal weight scaling in each bin of a discretized value of the prediction confidence. We report extensive experiments on a variety of image datasets and network architectures. This approach achieves state-of-the-art calibration with a guarantee that the classification accuracy is not altered.
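The abstract does not spell out the per-bin closed form, but the transform itself is fully specified: the calibrated distribution is q = w·p + (1 − w)·u, where p is the network's softmax output and u is the uniform distribution over the K classes. Below is a minimal NumPy sketch. The equal-width binning, the function names, and the particular closed form (choosing w so the average calibrated confidence in a bin matches that bin's empirical accuracy) are illustrative assumptions, not necessarily the authors' exact procedure.

```python
import numpy as np

def fit_weight_scaling(probs, labels, n_bins=15):
    """Fit one scaling weight per confidence bin on a held-out set.

    probs:  (N, K) softmax outputs; labels: (N,) integer targets.
    Returns the bin edges and per-bin weights.
    """
    n, k = probs.shape
    conf = probs.max(axis=1)                    # predicted confidence
    correct = probs.argmax(axis=1) == labels    # per-sample correctness
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    weights = np.ones(n_bins)                   # w = 1 leaves probs unchanged
    for b in range(n_bins):
        mask = (conf > edges[b]) & (conf <= edges[b + 1])
        if not mask.any():
            continue
        avg_conf, avg_acc = conf[mask].mean(), correct[mask].mean()
        # Assumed closed form: solve w * avg_conf + (1 - w) / k = avg_acc,
        # i.e. pick w so the bin's calibrated confidence equals its accuracy.
        if avg_conf > 1.0 / k:
            w = (avg_acc - 1.0 / k) / (avg_conf - 1.0 / k)
            weights[b] = np.clip(w, 1e-6, 1.0)  # w > 0 keeps the argmax intact
    return edges, weights

def apply_weight_scaling(probs, edges, weights):
    """Calibrate: q = w * p + (1 - w) * uniform, with w chosen by confidence bin."""
    n, k = probs.shape
    conf = probs.max(axis=1)
    bins = np.clip(np.digitize(conf, edges[1:-1]), 0, len(weights) - 1)
    w = weights[bins][:, None]
    return w * probs + (1.0 - w) / k

def ece(probs, labels, n_bins=15):
    """Expected Calibration Error: bin-weighted |confidence - accuracy| gap."""
    conf = probs.max(axis=1)
    correct = probs.argmax(axis=1) == labels
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for b in range(n_bins):
        mask = (conf > edges[b]) & (conf <= edges[b + 1])
        if mask.any():
            total += mask.mean() * abs(conf[mask].mean() - correct[mask].mean())
    return total
```

Because each weight is clipped to (0, 1], the convex combination only shrinks probabilities toward the uniform distribution and never reorders them, which is consistent with the abstract's guarantee that the predicted class, and hence classification accuracy, is unchanged.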
One-sentence Summary: A network calibration method based on a convex combination of the network output and the uniform distribution.