Effects of Neural Network Parameter Pruning on Uncertainty Estimation

TMLR Paper2386 Authors

18 Mar 2024 (modified: 17 Sept 2024) · Withdrawn by Authors · CC BY 4.0
Abstract: Obtaining uncertainty estimates for deep neural network predictions is of particular interest for safety-critical applications. Likewise, uncertainty estimation for compressed neural networks is highly relevant to real-world deployments, yet the interaction between uncertainty estimation and parameter pruning remains poorly understood. In this work, we present a study on the influence of parameter pruning on the uncertainty estimation of deep neural networks for image classification. We compress two popular image classification networks with five pruning approaches and evaluate uncertainty estimation alongside the standard accuracy metric after pruning. To measure uncertainty performance, we propose a new version of the Area Under the Sparsification Error (AUSE) for image classification and additionally evaluate the expected calibration error (ECE). We study the influence of pruning on three standard uncertainty estimation methods, namely maximum softmax probability, Monte Carlo dropout, and deep ensembles, on two datasets, CIFAR-100 and ImageNet. Our analysis shows that pruning degrades not only the class prediction but also the uncertainty estimation performance of a neural network. In general, uncertainty estimation performance decreases with increasing pruning sparsity, mirroring the behavior of the class prediction. Notably, in some special cases, pruning can improve the neural network's uncertainty estimation. Our code will be published after acceptance.
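The abstract names two evaluation metrics. As a rough, hedged illustration (not the authors' code), the sketch below computes the standard expected calibration error and a generic sparsification-based AUSE in the regression-style formulation; the paper's proposed classification-specific AUSE variant may differ, and the function names, binning, and step counts here are assumptions for illustration only.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """Standard ECE: bin predictions by confidence, then average the
    |accuracy - mean confidence| gap per bin, weighted by bin size."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap  # weight by fraction of samples in bin
    return ece

def ause(uncertainty, errors, steps=20):
    """Generic AUSE sketch (not the paper's proposed classification
    variant): area between the sparsification curve obtained by removing
    the most uncertain samples first and the oracle curve obtained by
    removing the truly erroneous samples first."""
    n = len(errors)
    by_unc = errors[np.argsort(-uncertainty)]  # most uncertain removed first
    oracle = errors[np.argsort(-errors)]       # largest errors removed first
    fracs = np.linspace(0.0, 1.0, steps, endpoint=False)
    gaps = []
    for f in fracs:
        keep = slice(int(f * n), n)            # samples remaining after removal
        gaps.append(by_unc[keep].mean() - oracle[keep].mean())
    return np.trapz(gaps, fracs)

# Usage sketch: for classification, `errors` can be 0/1 incorrectness and
# `uncertainty` e.g. one minus the maximum softmax probability.
conf = np.random.rand(1000)
corr = (np.random.rand(1000) < conf).astype(float)
print(expected_calibration_error(conf, corr))
print(ause(1.0 - conf, 1.0 - corr))
```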
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Bruno_Loureiro1
Submission Number: 2386