Posterior distribution analysis for Bayesian inference in neural networks

Published: 09 Dec 2016, Last Modified: 05 Oct 2024 · NeurIPS 2016 Bayesian Deep Learning Workshop · CC BY 4.0
Abstract: This study explores the posterior predictive distributions obtained with various Bayesian inference methods for neural networks. The quality of the distributions is assessed both visually and quantitatively, using Kullback–Leibler (KL) divergence, Kolmogorov–Smirnov (KS) distance, and precision-recall scores. We perform the analysis on a synthetic dataset that allows for a more detailed examination of the methods, and validate the findings on larger datasets. We find that among the recently proposed techniques, the simpler ones – Stochastic Gradient Langevin Dynamics (SGLD) and MC Dropout – consistently provide good approximations to the "true" posterior while requiring little parameter tuning.