Probabilistic Deep Learning with Generalised Variational Inference

Published: 29 Jan 2022, Last Modified: 05 May 2023, AABI 2022 Poster
Keywords: Probabilistic Deep Learning, Bayesian Neural Networks, Bayesian Inference, Variational Inference, Generalised Variational Inference
Abstract: We study probabilistic Deep Learning methods through the lens of Approximate Bayesian Inference. In particular, we examine Bayesian Neural Networks (BNNs), which typically rest on ill-posed modelling assumptions such as prior and likelihood misspecification. To this end, we investigate a recently proposed approximate inference framework called Generalised Variational Inference (GVI) and compare it to state-of-the-art methods, including standard Variational Inference, Monte Carlo Dropout, Stochastic Gradient Langevin Dynamics and Deep Ensembles. We also extend the original work on GVI by exploring a broader set of model architectures and mathematical settings on both real and synthetic data. Our experiments demonstrate that approximate posterior distributions derived from GVI offer attractive properties with respect to uncertainty quantification, robustness to prior specification and predictive performance, especially in the case of BNNs.
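For context, a brief sketch of the GVI objective as it is commonly formulated (our summary and notation, not taken from the paper): GVI defines the posterior as the solution of an optimisation problem over a variational family Q, replacing the negative log-likelihood with a general loss ℓ and the Kullback-Leibler regulariser with a general divergence D to the prior π:

q^{*}(\theta) \;=\; \underset{q \in \mathcal{Q}}{\arg\min}\; \mathbb{E}_{q(\theta)}\!\left[\sum_{i=1}^{n} \ell(\theta, x_i)\right] \;+\; D\!\left(q \,\|\, \pi\right)

Standard Variational Inference is recovered when ℓ is the negative log-likelihood and D is the KL divergence; alternative choices of ℓ and D are what give GVI the robustness to prior and likelihood misspecification examined in the abstract above.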