Generalized Variational Inference in Function Spaces: Gaussian Measures meet Bayesian Deep Learning

Published: 31 Oct 2022, Last Modified: 10 Jan 2023
NeurIPS 2022 Accept
Readers: Everyone
Keywords: Function Space Inference, Variational Inference, Bayesian Learning, Bayesian Inference, Generalised Variational Inference, Gaussian Processes, Gaussian Measures
TL;DR: Deep nets are not models to be made Bayesian; they are useful parametrisations for Bayesian models!
Abstract: We develop a framework for generalized variational inference in infinite-dimensional function spaces and use it to construct a method termed Gaussian Wasserstein inference (GWI). GWI leverages the Wasserstein distance between Gaussian measures on the Hilbert space of square-integrable functions to determine a variational posterior via a tractable optimization criterion, and it avoids the pathologies that arise in standard variational function space inference. An exciting application of GWI is the ability to use deep neural networks in its variational parametrization, combining their superior predictive performance with principled uncertainty quantification analogous to that of Gaussian processes. The proposed method obtains state-of-the-art performance on several benchmark datasets.
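For intuition, a minimal sketch of the quantity at the heart of GWI: the squared 2-Wasserstein distance between two Gaussians, which admits the closed form $W_2^2(\mathcal{N}(m_1, C_1), \mathcal{N}(m_2, C_2)) = \lVert m_1 - m_2 \rVert^2 + \mathrm{tr}(C_1) + \mathrm{tr}(C_2) - 2\,\mathrm{tr}\big((C_1^{1/2} C_2 C_1^{1/2})^{1/2}\big)$. The paper works with Gaussian measures on the infinite-dimensional space $L^2$; the sketch below is only the finite-dimensional matrix analogue (evaluations at a finite set of inputs), not the authors' implementation, and all names in it are illustrative.

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_wasserstein2_sq(m1, C1, m2, C2):
    """Squared 2-Wasserstein distance between N(m1, C1) and N(m2, C2).

    Finite-dimensional analogue of the distance between Gaussian
    measures on L^2 used in GWI (illustrative sketch only).
    """
    s1 = sqrtm(C1)                    # C1^{1/2}
    cross = sqrtm(s1 @ C2 @ s1)       # (C1^{1/2} C2 C1^{1/2})^{1/2}
    mean_term = float(np.sum((m1 - m2) ** 2))
    cov_term = np.trace(C1) + np.trace(C2) - 2.0 * np.trace(cross)
    # sqrtm can leave tiny imaginary numerical noise; keep the real part.
    return mean_term + float(np.real(cov_term))

# Toy usage: hypothetical prior vs. variational covariance at three inputs.
m_p, C_p = np.zeros(3), np.eye(3)
m_q, C_q = 0.1 * np.ones(3), 0.5 * np.eye(3)
print(gaussian_wasserstein2_sq(m_q, C_q, m_p, C_p))
```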
Supplementary Material: pdf
