PAC-Bayesian Neural Network Bounds

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission · Readers: Everyone
TL;DR: We derive a new PAC-Bayesian Bound for unbounded loss functions (e.g. Negative Log-Likelihood).
Abstract: Bayesian neural networks, which use the negative log-likelihood loss function and average their predictions under a learned posterior over the parameters, have been applied successfully across many scientific fields, partly due to their ability to `effortlessly' extract desired representations from large-scale datasets. However, generalization bounds for this setting are still missing. In this paper, we present a new PAC-Bayesian generalization bound for the negative log-likelihood loss which utilizes the \emph{Herbst argument} for the log-Sobolev inequality to bound the moment generating function of the learner's risk.
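For context on why an unbounded-loss bound is needed: classical PAC-Bayes bounds assume a loss bounded in $[0,1]$. One standard (McAllester-style) form, stated here only as background and with constants that vary across the literature, says that for any prior $P$ fixed before seeing the data and any posterior $Q$, with probability at least $1-\delta$ over an i.i.d. sample of size $n$:

```latex
\mathbb{E}_{h \sim Q}\big[L(h)\big]
\;\le\;
\mathbb{E}_{h \sim Q}\big[\hat{L}(h)\big]
+ \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{n}{\delta}}{2(n-1)}}
```

Here $L$ is the population risk and $\hat{L}$ the empirical risk. Since the negative log-likelihood is unbounded, such bounds do not apply directly; this is the gap the submission addresses by bounding the moment generating function of the risk via the Herbst argument.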
Keywords: PAC-Bayesian bounds, PAC-Bayes, Generalization bounds, Bayesian inference