The k-tied Normal Distribution: A Compact Parameterization of Gaussian Mean Field Posteriors in Bayesian Neural Networks

16 Oct 2019 (modified: 06 Dec 2019), AABI 2019 Symposium Blind Submission
  • Keywords: variational Bayes, Bayesian neural networks, mean field
  • TL;DR: Mean-field VB uses twice as many parameters as a deterministic network (a mean and a variance per weight); we tie the variance parameters in mean-field VB without any loss in ELBO, gaining speed and lower-variance gradients.
  • Abstract: Variational Bayesian Inference is a popular methodology for approximating posterior distributions over Bayesian neural network weights. Recent work developing this class of methods has explored ever richer parameterizations of the approximate posterior in the hope of improving performance. In contrast, here we share a curious experimental finding that suggests instead restricting the variational distribution to a more compact parameterization. For a variety of deep Bayesian neural networks trained using Gaussian mean-field variational inference, we find that the posterior standard deviations consistently exhibit strong low-rank structure after convergence. This means that by decomposing these variational parameters into a low-rank factorization, we can make our variational approximation more compact without decreasing the models' performance. Furthermore, we find that such factorized parameterizations improve the signal-to-noise ratio of stochastic gradient estimates of the variational lower bound, resulting in faster convergence.
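For concreteness, the factorization described in the abstract can be sketched as follows. This is a minimal, hypothetical PyTorch sketch, not the paper's reference implementation: the layer name `KTiedGaussianLinear`, the softplus positivity transform, and the initializations are illustrative assumptions. The idea it instantiates is the one stated above: the per-weight posterior standard deviations of a linear layer are tied through a rank-k nonnegative factorization, while the posterior means remain fully parameterized.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class KTiedGaussianLinear(nn.Module):
    """Mean-field variational linear layer whose posterior standard
    deviations are tied through a rank-k factorization (illustrative sketch)."""

    def __init__(self, in_features: int, out_features: int, k: int = 1):
        super().__init__()
        # One mean per weight, exactly as in standard mean-field VI.
        self.mu = nn.Parameter(0.01 * torch.randn(out_features, in_features))
        # Instead of a full (out x in) matrix of standard deviations,
        # keep two thin factors; softplus keeps them (and the stds) positive.
        self.u = nn.Parameter(torch.randn(out_features, k))
        self.v = nn.Parameter(torch.randn(in_features, k))

    def sigma(self) -> torch.Tensor:
        # Tied standard deviations: sigma_ij = sum_k softplus(u)_ik * softplus(v)_jk.
        return F.softplus(self.u) @ F.softplus(self.v).t()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reparameterization trick: w = mu + sigma * eps with eps ~ N(0, I),
        # so gradients flow to mu, u, and v through a single weight sample.
        eps = torch.randn_like(self.mu)
        w = self.mu + self.sigma() * eps
        return F.linear(x, w)
```

Under these assumptions, a 512x512 layer needs 512 * 512 = 262,144 standard-deviation parameters in ordinary mean-field VI, but only k * (512 + 512) here, e.g. 2,048 for k = 2, which removes nearly all of the parameter doubling the TL;DR alludes to.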