Why Cold Posteriors? On the Suboptimal Generalization of Optimal Bayes Estimates

Nov 23, 2020 (edited Jan 10, 2021) · AABI 2020
  • Keywords: Cold posterior, Bayesian deep learning, Neural networks
  • TL;DR: We explore several possible reasons for the cold posterior effect.
  • Abstract: Recent works have shown that the predictive accuracy of Bayesian deep learning models exhibits substantial improvements when the posterior is raised to the power 1/T with T<1. In this work, we explore several possible reasons for this surprising behavior.
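The 1/T tempering described in the abstract can be illustrated numerically with a toy one-dimensional posterior. This is a minimal sketch, not the paper's experimental setup; the Gaussian stand-in posterior and grid-based normalization are assumptions made purely for illustration. It shows the defining effect of a cold posterior: with T<1, the tempered density concentrates around the mode.

```python
import numpy as np

# A minimal numerical sketch of posterior tempering (illustrative only,
# not the paper's experiments): raising a posterior p(theta | D) to the
# power 1/T with T < 1 ("cold") concentrates mass around the mode.

theta = np.linspace(-5.0, 5.0, 2001)
dtheta = theta[1] - theta[0]

# Stand-in posterior: log p(theta | D) for a standard normal N(0, 1).
log_post = -0.5 * theta**2

def tempered(log_p, T):
    """Normalized density proportional to exp(log_p / T) on the grid."""
    z = log_p / T
    z -= z.max()                 # avoid overflow before exponentiating
    p = np.exp(z)
    return p / (p.sum() * dtheta)

def variance(p):
    mean = (theta * p).sum() * dtheta
    return (((theta - mean) ** 2) * p).sum() * dtheta

warm = tempered(log_post, T=1.0)    # ordinary Bayes posterior
cold = tempered(log_post, T=0.25)   # cold posterior, T < 1

# Tempering N(0, 1) with exponent 1/T yields N(0, T), so the cold
# posterior's variance shrinks from ~1.0 to ~0.25.
print(round(variance(warm), 2), round(variance(cold), 2))
```

For a Gaussian this reduces to a closed form (N(0,1) tempered with exponent 1/T is N(0,T)), which is what makes the toy case easy to check; the paper's interest is in why this sharpening helps deep networks, where no such closed form exists.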