An Empirical Analysis of the Advantages of Finite v.s. Infinite Width Bayesian Neural Networks

Published: 06 Dec 2022 (Last Modified: 10 Nov 2024) · ICBINB poster
Keywords: Bayesian Deep Learning, NNGP, BNNs
TL;DR: An Empirical Analysis of the Advantages of Finite v.s. Infinite Width Bayesian Neural Networks
Abstract: Comparing Bayesian neural networks (BNNs) of different widths is challenging because, as the width increases, multiple model properties change simultaneously, and inference in the finite-width case is intractable. In this work, we empirically compare finite- and infinite-width BNNs and provide quantitative and qualitative explanations for their performance difference. We find that when the model is misspecified, increasing width can hurt BNN performance. In these cases, we provide evidence that finite-width BNNs generalize better, partly because of properties of their frequency spectrum that allow them to adapt under model mismatch.
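For context on the infinite-width side of the comparison: in the infinite-width limit, a BNN's prior becomes a Gaussian process (the NNGP), so exact Bayesian inference reduces to GP regression with the NNGP kernel. Below is a minimal, hedged sketch (not the paper's code) using the well-known closed-form NNGP kernel of a one-hidden-layer ReLU network, i.e. the order-1 arc-cosine kernel; function names and the noise level are illustrative choices.

```python
import numpy as np

def relu_nngp_kernel(x, y):
    """Order-1 arc-cosine kernel: E_w[ReLU(w.x) ReLU(w.y)] for w ~ N(0, I).
    This is the NNGP kernel of a one-hidden-layer ReLU network (unit weight
    variance, no bias) in the infinite-width limit."""
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    cos_t = np.clip(x @ y / (nx * ny), -1.0, 1.0)  # guard against round-off
    theta = np.arccos(cos_t)
    return (nx * ny / (2 * np.pi)) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

def nngp_posterior_mean(X, y, X_test, noise=1e-2):
    """GP regression posterior mean under the NNGP kernel: this is exact
    Bayesian inference for the infinite-width network, in contrast to the
    intractable finite-width posterior."""
    K = np.array([[relu_nngp_kernel(a, b) for b in X] for a in X])
    Ks = np.array([[relu_nngp_kernel(a, b) for b in X] for a in X_test])
    return Ks @ np.linalg.solve(K + noise * np.eye(len(X)), y)
```

As a sanity check on the kernel, `relu_nngp_kernel(x, x)` equals `||x||**2 / 2`, matching `E[ReLU(z)**2]` for `z ~ N(0, ||x||**2)`.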
Community Implementations: 2 code implementations (https://www.catalyzex.com/paper/an-empirical-analysis-of-the-advantages-of/code)
