Keywords: Bayesian deep learning, variational inference, heteroscedastic uncertainty, Bayesian optimization
TL;DR: We introduce a highly scalable and performant variational deep learning approach for heteroscedastic settings
Abstract: We present a simple, inexpensive, and effective method for heteroscedastic uncertainty quantification in neural networks. We build on Variational Bayesian Last Layers (VBLL), in which deterministic training objectives are derived for variational inference over the network's last layer. In particular, we (1) introduce t-VBLL layers, which perform variational inference over the aleatoric noise covariance, and (2) introduce Het-VBLL, a Bayesian last layer scheme for modeling heteroscedastic noise. Both methods are based on novel, analytically tractable evidence lower bounds. We further discuss parameterization and initialization within these models. We show that these design elements enable effective uncertainty modeling at minimal additional cost, and substantially improve performance over similar methods such as standard VBLLs.
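The heteroscedastic setting targeted by the abstract can be illustrated with the standard input-dependent Gaussian likelihood, where a network predicts both a mean and a log-variance per input. This is a minimal generic sketch of that loss, not the paper's VBLL-based variational method; all names here are illustrative:

```python
import numpy as np

def heteroscedastic_nll(y, mean, log_var):
    """Per-sample Gaussian negative log-likelihood with input-dependent
    (heteroscedastic) variance: 0.5 * (log var + (y - mean)^2 / var + log 2*pi).
    Predicting log-variance keeps the variance positive by construction."""
    var = np.exp(log_var)
    return 0.5 * (log_var + (y - mean) ** 2 / var + np.log(2 * np.pi))

# A heteroscedastic network head would emit (mean, log_var) per input;
# here we evaluate the loss on toy values.
y = np.array([1.0, 2.0])
mean = np.array([1.1, 1.5])
log_var = np.array([0.0, -1.0])
loss = heteroscedastic_nll(y, mean, log_var).mean()
```

In the VBLL setting described above, this point-estimate likelihood head would be replaced by a variational posterior over the last-layer weights (and, for t-VBLL, over the noise covariance), trained through an analytically tractable evidence lower bound rather than this direct NLL.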
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 21785