Omnibus Dropout for Improving The Probabilistic Classification Outputs of ConvNets

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission
Keywords: Uncertainty Estimation, Calibration, Deep Learning
TL;DR: We propose combining structured dropout methods at different scales to promote model diversity and improve the quality of dropout uncertainty estimates.
Abstract: While neural network models achieve impressive classification accuracy across different tasks, they can suffer from poor calibration of their probabilistic predictions. A Bayesian perspective has recently suggested that dropout, a regularization strategy popularly used during training, can be employed to obtain better probabilistic predictions at test time (Gal & Ghahramani, 2016a). However, empirical results so far have not been encouraging, particularly with convolutional networks. In this paper, through the lens of ensemble learning, we associate this unsatisfactory performance with the correlation between the models sampled with dropout. Motivated by this, we explore the use of various structured dropout techniques to promote model diversity and improve the quality of probabilistic predictions. We also propose an omnibus dropout strategy that combines various structured dropout methods. Using the SVHN, CIFAR-10 and CIFAR-100 datasets, we empirically demonstrate the superior performance of omnibus dropout relative to several widely used strong baselines in addition to regular dropout. Lastly, we show the merit of omnibus dropout in a Bayesian active learning application.
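The abstract builds on MC dropout (Gal & Ghahramani, 2016a), which keeps dropout active at test time and averages several stochastic forward passes to obtain probabilistic predictions; the omnibus variant proposed here additionally combines structured dropout layers at different scales. Below is a minimal PyTorch sketch of the test-time sampling step only; the model, the dropout layer types, and the number of samples are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def mc_dropout_predict(model: nn.Module, x: torch.Tensor, num_samples: int = 20) -> torch.Tensor:
    """Average softmax outputs over stochastic forward passes with dropout kept active."""
    model.eval()
    # Re-enable only the dropout layers so batch-norm statistics stay frozen.
    for m in model.modules():
        if isinstance(m, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)):
            m.train()
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(x), dim=-1) for _ in range(num_samples)]
        )
    # Mean over samples gives the probabilistic prediction used for calibration metrics.
    return probs.mean(dim=0)
```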