Toward Better PAC-Bayes Bounds for Uniformly Stable Algorithms

Published: 21 Sept 2023, Last Modified: 02 Nov 2023 (NeurIPS 2023 poster)
Keywords: PAC-Bayesian Bounds, Uniform Stability, Generalization Analysis
TL;DR: We develop sharper generalization bounds for uniformly stable randomized algorithms in a PAC-Bayesian framework.
Abstract: We give sharper bounds for uniformly stable randomized algorithms in a PAC-Bayesian framework, improving the existing results by up to a factor of $\sqrt{n}$ (ignoring a log factor), where $n$ is the sample size. The key idea is to bound the moment generating function of the generalization gap using the concentration of weakly dependent random variables due to Bousquet et al. (2020). We introduce an assumption of a sub-exponential stability parameter, which allows a general treatment that we instantiate in two applications: stochastic gradient descent and randomized coordinate descent. Our results remove the strong convexity requirement of previous work and hold for non-smooth convex problems.
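Background (a hedged sketch for context; the paper's exact notation and constants may differ): the results concern $\beta$-uniformly stable randomized algorithms, a standard notion defined by
$$\sup_{z}\;\bigl|\,\mathbb{E}_{A}[\ell(A(S), z)] - \mathbb{E}_{A}[\ell(A(S'), z)]\,\bigr| \;\le\; \beta,$$
where $S$ and $S'$ are datasets of size $n$ differing in a single example, $z$ is any test point, and the expectation is over the algorithm's internal randomness. The abstract's bounds control the generalization gap of such algorithms in terms of $\beta$ and $n$.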
Supplementary Material: zip
Submission Number: 773