Preconditioned Crank-Nicolson Algorithms for Wide Bayesian Neural Networks

Published: 10 Oct 2024, Last Modified: 28 Nov 2024, NeurIPS BDU Workshop 2024 Poster, CC BY 4.0
Keywords: Preconditioned Crank-Nicolson, Bayesian Neural Networks, Wide Neural Networks, Neural Network Gaussian Process
Abstract: Bayesian Neural Networks combine deep learning techniques with probabilistic reasoning, offering a principled framework for quantifying uncertainty in complex predictive models. In this paper, we consider Bayesian Neural Networks with Gaussian initialization and investigate the use of the preconditioned Crank-Nicolson algorithm to sample from the reparametrized posterior distribution of the weights as the width of the network grows. Beyond its robustness in the infinite-dimensional setting, we prove that the acceptance probability of the preconditioned Crank-Nicolson sampler approaches 1 as the width of the network goes to infinity, independently of any stepsize tuning. We then compare how the efficiency of the Langevin Monte Carlo, preconditioned Crank-Nicolson, and preconditioned Crank-Nicolson Langevin samplers is influenced by changes in network width in some real-world cases. In particular, we demonstrate that in wide Bayesian Neural Network configurations, the proposed method allows for more efficient sampling, as evidenced by a higher effective sample size and improved diagnostic results compared with the Langevin Monte Carlo algorithm.
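As a concrete illustration of the sampler the abstract refers to (a minimal sketch, not the paper's implementation), the preconditioned Crank-Nicolson update under a standard Gaussian prior can be written as follows; the function `pcn_sample` and its parameters are hypothetical names chosen for this example. The key property is that the proposal preserves the Gaussian prior exactly, so the acceptance ratio involves only the likelihood, which is what makes the method dimension-robust:

```python
import numpy as np

def pcn_sample(neg_log_lik, theta0, beta=0.2, n_steps=1000, rng=None):
    """Preconditioned Crank-Nicolson MCMC under a N(0, I) prior.

    neg_log_lik: callable returning the negative log-likelihood Phi(theta).
    Because the pCN proposal is invariant for the Gaussian prior, the
    Metropolis-Hastings acceptance probability depends only on Phi.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    phi = neg_log_lik(theta)
    samples, accepts = [], 0
    for _ in range(n_steps):
        # pCN proposal: autoregressive step that leaves the prior invariant
        xi = rng.standard_normal(theta.shape)
        prop = np.sqrt(1.0 - beta**2) * theta + beta * xi
        phi_prop = neg_log_lik(prop)
        # accept with probability min(1, exp(Phi(theta) - Phi(prop)))
        if np.log(rng.uniform()) < phi - phi_prop:
            theta, phi = prop, phi_prop
            accepts += 1
        samples.append(theta.copy())
    return np.array(samples), accepts / n_steps
```

In the wide-network setting studied in the paper, the reparametrized weight posterior approaches its Gaussian prior, so the likelihood ratio above tends to 1 and the acceptance probability approaches 1 regardless of the stepsize `beta`.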
Submission Number: 22