Non-Asymptotic PAC-Bayes Bounds on Generalisation Error

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission
Keywords: PAC-Bayes Bounds, Large Deviation Theory, Concentration Inequalities, Generalisation Error
Abstract: Constructing non-vacuous PAC-Bayes bounds on generalization errors for unbounded risk functionals, especially in the non-asymptotic regime, is an active area of research. However, current state-of-the-art results are applicable only in some very specialized cases. In this work, we give an integrability condition which exactly characterizes when any risk functional, for a given data set and model space, admits such bounds, using the Lévy-Khintchine theorem. Further, we derive a Bahadur-Rao-type exact asymptotic bound, which is much sharper than a traditional Chernoff-type inequality, especially in the under-sampled regime. These bounds give us the flexibility to construct data- or model-dependent consistency-promoting updates to a data-free prior, which provably improve generalization performance.
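For intuition on the sharpness claim above, the gap between a Chernoff-type bound and a Bahadur-Rao exact asymptotic can be seen in a toy Bernoulli tail computation. This sketch is illustrative only and is not from the paper; all numbers (n = 100, p = 0.5, threshold a = 0.7) are arbitrary choices, and the lattice-case Bahadur-Rao formula used here is the standard one for i.i.d. Bernoulli sums.

```python
import math

# Illustrative parameters (not from the paper):
# n i.i.d. Bernoulli(p) samples, tail event {S_n >= n*a} with a > p.
n, p, a = 100, 0.5, 0.7

# Exact binomial tail probability P(S_n >= n*a).
exact = sum(math.comb(n, k) * p**k * (1 - p) ** (n - k)
            for k in range(int(n * a), n + 1))

# Large-deviation rate function I(a) = KL(a || p) for Bernoulli.
kl = a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

# Chernoff-type bound: P(S_n >= n*a) <= exp(-n * I(a)).
chernoff = math.exp(-n * kl)

# Bahadur-Rao exact asymptotic (lattice case, span 1):
# t solves Lambda'(t) = a, i.e. e^t = a(1-p) / ((1-a)p);
# the tilted distribution is Bernoulli(a), with variance a(1-a).
t = math.log(a * (1 - p) / ((1 - a) * p))
sigma = math.sqrt(a * (1 - a))
bahadur_rao = math.exp(-n * kl) / (
    (1 - math.exp(-t)) * sigma * math.sqrt(2 * math.pi * n)
)

print(f"exact       = {exact:.3e}")
print(f"Chernoff    = {chernoff:.3e}   (off by ~{chernoff / exact:.1f}x)")
print(f"Bahadur-Rao = {bahadur_rao:.3e}")
```

At these settings the Chernoff bound overshoots the exact tail by roughly a factor of seven, while the Bahadur-Rao asymptotic lands within a few percent, which is the kind of sharpening in the under-sampled regime that the abstract refers to.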
One-sentence Summary: Constructing non-vacuous data/model-dependent PAC-Bayes bounds on generalization errors for unbounded risk functionals, especially in the non-asymptotic regime
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=6W897Q3Uhw