PAC-Bayes-Chernoff bounds for unbounded losses

Published: 25 Sept 2024 · Last Modified: 06 Nov 2024 · NeurIPS 2024 poster · CC BY 4.0
Keywords: Statistical learning theory, PAC-Bayes, Chernoff bounds, regularization
TL;DR: We introduce a new PAC-Bayes bound that allows working with richer assumptions and illustrate its potential by generalizing previous bounds, obtaining novel ones for several regularization techniques, and minimizing them to get new posteriors
Abstract:

We introduce a new PAC-Bayes oracle bound for unbounded losses that extends Cramér-Chernoff bounds to the PAC-Bayesian setting. The proof technique relies on controlling the tails of certain random variables involving the Cramér transform of the loss. Our approach naturally leverages properties of Cramér-Chernoff bounds, such as exact optimization of the free parameter in many PAC-Bayes bounds. We highlight several applications of the main theorem. Firstly, we show that our bound recovers and generalizes previous results. Additionally, our approach allows working with richer assumptions that result in more informative and potentially tighter bounds. In this direction, we provide a general bound under a new model-dependent assumption from which we obtain bounds based on parameter norms and log-Sobolev inequalities. Notably, many of these bounds can be minimized to obtain distributions beyond the Gibbs posterior and provide novel theoretical coverage to existing regularization techniques.
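For context, a brief reminder of the standard Cramér-Chernoff machinery the abstract refers to; the notation below is ours and may differ from the paper's. For a real random variable X with cumulant generating function Λ, the Cramér transform Λ* is its Legendre-Fenchel dual, and optimizing the free parameter λ exactly in the Chernoff bound gives a tail bound in terms of Λ*. The Gibbs posterior shown last is the usual minimizer of classical PAC-Bayes bounds, which the paper's new bounds go beyond.

```latex
% Sketch of the standard Cramér-Chernoff machinery (notation ours, not the paper's).
\[
  \Lambda(\lambda) \;=\; \log \mathbb{E}\!\left[e^{\lambda X}\right]
  \qquad \text{(cumulant generating function of } X\text{)}
\]
\[
  \Lambda^{*}(t) \;=\; \sup_{\lambda \ge 0}\,\bigl\{\lambda t - \Lambda(\lambda)\bigr\}
  \qquad \text{(Cram\'er transform, i.e.\ Legendre--Fenchel dual)}
\]
\[
  \mathbb{P}(X \ge t) \;\le\; \inf_{\lambda \ge 0} e^{\Lambda(\lambda)-\lambda t}
  \;=\; e^{-\Lambda^{*}(t)}
  \qquad \text{(Chernoff bound, free parameter optimized exactly)}
\]
% For reference, the Gibbs posterior over hypotheses h with prior \pi, empirical
% loss \hat{L}, sample size n, and temperature \lambda:
\[
  \hat\rho(h) \;\propto\; \pi(h)\, e^{-\lambda n \hat{L}(h)}
\]
```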

Supplementary Material: zip
Primary Area: Learning theory
Submission Number: 9608