Asymptotic Properties for Bayesian Neural Network in Besov Space

Published: 31 Oct 2022, Last Modified: 14 Jan 2023
NeurIPS 2022 Accept
Readers: Everyone
Keywords: Bayesian neural network, sparsity, deep learning, posterior consistency, optimal rate
TL;DR: We propose a practical Bayesian neural network with guaranteed asymptotic properties
Abstract: Neural networks have shown great predictive power when applied to unstructured data such as images and natural language. The Bayesian neural network captures the uncertainty of prediction by computing the posterior distribution of the model parameters. In this paper, we show that the Bayesian neural network with a spike-and-slab prior has posterior consistency with a near minimax optimal convergence rate when the true regression function belongs to the Besov space. The spike-and-slab prior is adaptive to the smoothness of the regression function, so the posterior convergence rate does not change even when the smoothness of the regression function is unknown. We also consider the shrinkage prior, which is computationally more feasible than the spike-and-slab prior, and show that it has the same posterior convergence rate as the spike-and-slab prior.
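For reference, a generic spike-and-slab prior places an independent two-component mixture on each network weight $\theta_j$; the display below is a standard textbook form given only as an illustration, and the paper's exact prior and hyperparameters may differ:

    \pi(\theta_j) = (1 - \lambda)\, \delta_0(\theta_j) + \lambda\, \mathcal{N}(\theta_j \mid 0, \sigma^2), \qquad \lambda \in (0, 1),

where $\delta_0$ is a point mass at zero (the "spike") and $\mathcal{N}(0, \sigma^2)$ is the Gaussian "slab". A shrinkage prior in this setting typically replaces the point mass with a continuous density concentrated near zero, which is what makes posterior computation more tractable.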
Supplementary Material: zip