Spike-and-Slab Probabilistic Backpropagation: When Smarter Approximations Make No Difference

Published: 06 Dec 2022, Last Modified: 05 May 2023
Venue: ICBINB poster
Keywords: Bayesian, Probabilistic Backpropagation, Spike-and-Slab, Approximation
TL;DR: Gaussian approximations fail to capture the shape of individual message updates in probabilistic backpropagation, but a more sophisticated approximation does not significantly improve overall performance.
Abstract: Probabilistic backpropagation is an approximate Bayesian inference method for deep neural networks, using a message-passing framework. These messages---which correspond to distributions arising as we propagate our input through a probabilistic neural network---are approximated as Gaussian. However, in practice, the exact distributions may be highly non-Gaussian. In this paper, we propose a more realistic approximation based on a spike-and-slab distribution. Unfortunately, in this case, better approximation of the messages does not translate to better downstream performance. We present results comparing the two schemes and discuss why we do not see a benefit from this spike-and-slab approach.
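For intuition, the sketch below (not the authors' code) contrasts the two kinds of message approximation after a ReLU nonlinearity, assuming the incoming pre-activation message is Gaussian N(mu, var). It uses standard Gaussian and truncated-Gaussian moment-matching formulas; the function names and the specific spike-and-slab parameterization (a point mass at zero plus a Gaussian slab matched to the positive part) are illustrative assumptions, not the paper's exact construction.

```python
# Illustrative sketch only: two ways to approximate the post-ReLU message
# when the pre-activation a ~ N(mu, var). Not the authors' implementation.
import numpy as np
from scipy.stats import norm


def gaussian_relu_message(mu, var):
    """Single-Gaussian (PBP-style) moment match of max(a, 0)."""
    sigma = np.sqrt(var)
    alpha = mu / sigma
    mean = mu * norm.cdf(alpha) + sigma * norm.pdf(alpha)
    second = (mu**2 + var) * norm.cdf(alpha) + mu * sigma * norm.pdf(alpha)
    return mean, second - mean**2


def spike_and_slab_relu_message(mu, var):
    """Spike-and-slab approximation: a point mass at 0 with weight P(a <= 0),
    plus a Gaussian slab matched to the moments of a | a > 0."""
    sigma = np.sqrt(var)
    alpha = mu / sigma
    spike_prob = norm.cdf(-alpha)                  # P(a <= 0)
    lam = norm.pdf(alpha) / norm.cdf(alpha)        # inverse Mills ratio
    slab_mean = mu + sigma * lam                   # E[a | a > 0]
    slab_var = var * (1.0 - alpha * lam - lam**2)  # Var[a | a > 0]
    return spike_prob, slab_mean, slab_var


if __name__ == "__main__":
    mu, var = -0.5, 1.0
    print(gaussian_relu_message(mu, var))        # single Gaussian message
    print(spike_and_slab_relu_message(mu, var))  # spike weight + slab moments
```

The single Gaussian collapses the point mass at zero and the positive tail into one mean and variance, whereas the spike-and-slab form keeps them separate; the paper's finding is that this extra fidelity in the messages does not translate into better downstream performance.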