Rethinking Function-Space Variational Inference in Bayesian Neural Networks

Published: 21 Dec 2020, Last Modified: 05 May 2023, AABI 2020
Keywords: Variational Inference, Bayesian Neural Networks, Function-Space Methods
TL;DR: The paper proposes a scalable approach to function-space variational inference in Bayesian neural networks.
Abstract: Bayesian neural networks (BNNs) define distributions over functions induced by distributions over parameters. In practice, this model specification makes it difficult to define and use meaningful prior distributions over functions that could aid in training. Moreover, previous attempts at defining an explicit function-space variational objective for approximate inference in BNNs require approximations that do not scale to high-dimensional data. We propose a new function-space approach to variational inference in BNNs and derive a tractable variational objective by linearizing the BNN's posterior predictive distribution about its mean parameters, allowing function-space variational inference to be scaled to large and high-dimensional datasets. We evaluate this approach empirically and show that it leads to models with competitive predictive accuracy and significantly improved predictive uncertainty estimates compared to parameter-space variational inference.
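The abstract's core mechanism, linearizing a network about its mean parameters, is a first-order Taylor expansion in parameter space: f_lin(θ, x) = f(μ, x) + J_μ(x)(θ − μ). A minimal sketch of this idea (not the paper's implementation; the toy network and parameter shapes are illustrative assumptions) using a Jacobian-vector product:

```python
import jax
import jax.numpy as jnp


def f(params, x):
    # Toy one-hidden-layer network (illustrative stand-in for a BNN's mean function).
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return h @ w2 + b2


def linearize_at_mean(net, mean_params):
    # First-order Taylor expansion of `net` in its parameters around `mean_params`:
    #   f_lin(params, x) = net(mean_params, x) + J_mean(x) @ (params - mean_params)
    # The Jacobian-vector product is computed by jax.jvp without materializing J.
    def f_lin(params, x):
        delta = jax.tree_util.tree_map(lambda p, m: p - m, params, mean_params)
        out, tangent = jax.jvp(lambda p: net(p, x), (mean_params,), (delta,))
        return out + tangent
    return f_lin
```

Because f_lin is affine in the parameters, a Gaussian variational posterior over θ induces a Gaussian predictive distribution over function values, which is what makes the resulting function-space objective tractable.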