Why Cold Posteriors? On the Suboptimal Generalization of Optimal Bayes Estimates

Published: 21 Dec 2020, Last Modified: 05 May 2023, AABI 2020
Keywords: Cold posterior, Bayesian deep learning, Neural networks
TL;DR: We explore several possible reasons for the cold posterior effect.
Abstract: Recent works have shown that the predictive accuracy of Bayesian deep learning models improves substantially when the posterior is raised to the power 1/T with T < 1. In this work, we explore several possible reasons for this surprising behavior.
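For concreteness, the tempered ("cold") posterior described in the abstract is commonly written as follows; this is the standard notation in this line of work, not a formula quoted from the paper itself:

$$p_T(\theta \mid \mathcal{D}) \;\propto\; \bigl(p(\mathcal{D} \mid \theta)\, p(\theta)\bigr)^{1/T}, \qquad T < 1,$$

where T = 1 recovers the ordinary Bayes posterior and T < 1 sharpens it around high-probability parameters.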