Structured Partial Stochasticity in Bayesian Neural Networks

Published: 27 May 2024 · Last Modified: 02 Jul 2024 · AABI 2024 · CC BY 4.0
Keywords: Bayesian neural networks, Bayesian deep learning, weight permutation symmetry, partial stochasticity, variational inference, approximate inference
Abstract: Bayesian neural network posterior distributions contain many modes that correspond to the same network function. This abundance of redundant modes can hinder approximate inference methods. Recent work has demonstrated the benefits of partial stochasticity for approximate inference in Bayesian neural networks: inference can be less costly, and performance can sometimes be improved. I propose a structured way to select the deterministic subset of weights that removes neuron permutation symmetries, and with them the corresponding redundant posterior modes. With a drastically simplified posterior distribution, the performance of existing approximate inference schemes is found to be greatly improved.
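To make the idea concrete, below is a minimal PyTorch sketch of one plausible symmetry-breaking construction; the paper's exact scheme may differ. The assumption here: in each layer, one input column of the weight matrix is held deterministic and fixed to distinct values per output neuron, while the remaining weights get a mean-field Gaussian variational posterior. Because swapping two hidden neurons would require swapping their fixed anchor values, which are unreachable, permuted weight settings no longer yield duplicate posterior modes. The class and attribute names (`PartiallyStochasticLinear`, `anchor`) are hypothetical, not from the paper.

```python
import torch
import torch.nn as nn

class PartiallyStochasticLinear(nn.Module):
    """Linear layer whose weights are stochastic (mean-field Gaussian)
    except for one deterministic input column, fixed to distinct values
    per output neuron so that neuron permutation symmetry is broken."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        assert in_features >= 2, "need at least one stochastic column"
        # Variational parameters for the stochastic weights (all columns but the first).
        self.w_mu = nn.Parameter(0.1 * torch.randn(out_features, in_features - 1))
        self.w_log_sigma = nn.Parameter(torch.full((out_features, in_features - 1), -3.0))
        # Deterministic "anchor" column: distinct fixed values, one per neuron.
        # Registered as a buffer (not a Parameter), so it is never updated or sampled.
        self.register_buffer("anchor", torch.linspace(-1.0, 1.0, out_features).unsqueeze(1))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reparameterised sample of the stochastic weights.
        w_stoch = self.w_mu + torch.exp(self.w_log_sigma) * torch.randn_like(self.w_mu)
        # Full weight matrix: fixed anchor column, stochastic remainder.
        w = torch.cat([self.anchor, w_stoch], dim=1)
        return x @ w.t() + self.bias

# Usage: each forward pass draws a fresh weight sample from the variational posterior.
layer = PartiallyStochasticLinear(10, 32)
y = layer(torch.randn(8, 10))
```

Note the design choice in this sketch: the anchors are distinct across neurons, which is what makes every permutation of hidden units produce a genuinely different function; the variational posterior then only needs to model the remaining, non-redundant weights.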
Submission Number: 8