Data augmentation in Bayesian neural networks and the cold posterior effect

Published: 20 May 2022, Last Modified: 05 May 2023. UAI 2022 Poster.
Keywords: Bayesian Deep Learning, Data Augmentation, Bayesian Neural Networks, Invariant Neural Networks
TL;DR: We propose a principled method to incorporate data augmentation into BNN training, but find that the cold posterior effect persists.
Abstract: Bayesian neural networks that incorporate data augmentation implicitly use a "randomly perturbed log-likelihood [which] does not have a clean interpretation as a valid likelihood function" (Izmailov et al. 2021). Here, we provide several approaches to developing principled Bayesian neural networks incorporating data augmentation. We introduce a "finite orbit" setting which allows valid likelihoods to be computed exactly, and for the more usual "full orbit" setting we derive multi-sample bounds tighter than those used previously for Bayesian neural networks with data augmentation. These models cast light on the origin of the cold posterior effect. In particular, we find that the cold posterior effect persists even in these principled models incorporating data augmentation. This suggests that the cold posterior effect cannot be dismissed as an artifact of data augmentation using incorrect likelihoods.
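To make the "full orbit" setting concrete, here is a minimal sketch of the kind of multi-sample bound described above; the notation is illustrative rather than taken verbatim from the paper. If the augmentation-averaged likelihood is $p(y \mid x, \theta) = \mathbb{E}_{g \sim p(g)}\left[p(y \mid g(x), \theta)\right]$, where $g$ ranges over the augmentation orbit, then Jensen's inequality gives a $K$-sample lower bound
$$\log p(y \mid x, \theta) \;\ge\; \mathbb{E}_{g_1,\dots,g_K \sim p(g)}\Big[\log \tfrac{1}{K}\textstyle\sum_{k=1}^{K} p(y \mid g_k(x), \theta)\Big],$$
which at $K = 1$ reduces to the usual data-augmentation training objective (the expected log-likelihood over random augmentations) and tightens as $K$ increases, approaching the exact orbit-averaged log-likelihood. In the "finite orbit" setting, where the orbit contains finitely many augmentations, the expectation can be computed exactly instead of bounded.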