Collapsed Inference for Bayesian Deep Learning

Published: 19 Jun 2023, Last Modified: 28 Jul 2023 · 1st SPIGM @ ICML (Oral)
Keywords: Bayesian Deep Learning, Uncertainty Quantification, Bayesian Model Averaging, Probabilistic Inference, SMT Constraints, Weighted Model Integration
TL;DR: We reveal a previously unexplored connection between inference in Bayesian neural networks and volume computation problems, and use it to introduce a novel collapsed inference scheme that performs Bayesian model averaging with collapsed samples.
Abstract: Bayesian neural networks (BNNs) provide a formalism to quantify and calibrate uncertainty in deep learning. Current inference approaches for BNNs often resort to few-sample estimation for scalability, which can harm predictive performance, while their alternatives tend to be prohibitively expensive computationally. We tackle this challenge by revealing a previously unexplored connection between inference on BNNs and volume computation problems. Building on this observation, we introduce a novel collapsed inference scheme that performs Bayesian model averaging using collapsed samples. It improves over a standard Monte Carlo sample by limiting sampling to a subset of the network weights and pairing it with a closed-form conditional distribution over the rest. A collapsed sample thus represents uncountably many models drawn from the approximate posterior, yielding higher sample efficiency. Further, we show that the marginalization of a collapsed sample can be solved analytically and efficiently despite the non-linearity of neural networks by leveraging existing volume computation solvers. Our proposed use of collapsed samples achieves a balance between scalability and accuracy. On various regression and classification tasks, our collapsed Bayesian deep learning approach demonstrates significant improvements over existing methods and sets a new state of the art in terms of uncertainty estimation and predictive performance.
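To make the collapsed-sample idea concrete, the sketch below is an illustrative toy, not the authors' method: it Monte Carlo samples the hidden-layer weights of a small regression network while marginalizing the last-layer weights in closed form under an assumed Gaussian conditional posterior (a linear-Gaussian output layer, where the integral is analytic), so each collapsed sample stands for infinitely many networks. The paper instead handles marginalization through the network's non-linearities via volume computation (weighted model integration) solvers; all function names, dimensions, and posterior parameters here are hypothetical.

# Minimal sketch (assumptions noted above): collapsed Bayesian model averaging
# with a sampled hidden layer and an analytically marginalized Gaussian output layer.
import numpy as np

rng = np.random.default_rng(0)

def hidden_features(x, W1):
    """Nonlinear feature map given one sampled hidden-layer weight matrix."""
    return np.tanh(x @ W1)

def collapsed_predict(x, W1_samples, w2_mean, w2_cov, noise_var=0.1):
    """Average the closed-form Gaussian predictive over collapsed samples."""
    means, variances = [], []
    for W1 in W1_samples:                      # sampled subset of the weights
        phi = hidden_features(x, W1)           # (n, h) features
        mu = phi @ w2_mean                     # analytic predictive mean
        var = np.einsum("nh,hk,nk->n", phi, w2_cov, phi) + noise_var
        means.append(mu)
        variances.append(var)
    means, variances = np.stack(means), np.stack(variances)
    # Moment-match the Gaussian mixture induced by the collapsed samples.
    pred_mean = means.mean(axis=0)
    pred_var = variances.mean(axis=0) + means.var(axis=0)
    return pred_mean, pred_var

# Toy usage with made-up dimensions and (hypothetical) posterior parameters.
d, h, n_samples = 3, 8, 5
x = rng.normal(size=(10, d))
W1_samples = 0.5 * rng.normal(size=(n_samples, d, h))  # stand-in posterior draws
w2_mean = rng.normal(size=h)
w2_cov = 0.01 * np.eye(h)
mean, var = collapsed_predict(x, W1_samples, w2_mean, w2_cov)
print(mean.shape, var.shape)  # (10,) (10,)

Each loop iteration averages over an entire conditional distribution of output-layer weights rather than a single point, which is the sample-efficiency gain the abstract attributes to collapsed samples.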
Submission Number: 102