Sample-Conditioned Hypothesis Stability Sharpens Information-Theoretic Generalization Bounds

Published: 21 Sept 2023, Last Modified: 02 Nov 2023 · NeurIPS 2023 poster
Keywords: generalization, information-theoretic bounds, stability
TL;DR: We derive sharper information-theoretic generalization bounds by combining them with a stability-based framework.
Abstract: We present new information-theoretic generalization guarantees through a novel construction of the "neighboring-hypothesis" matrix and a new family of stability notions termed sample-conditioned hypothesis (SCH) stability. Our approach yields sharper bounds that improve upon previous information-theoretic bounds in various learning scenarios. Notably, these bounds address the limitations of existing information-theoretic bounds in the context of stochastic convex optimization (SCO) problems, as explored in the recent work by Haghifam et al. (2023).
Supplementary Material: pdf
Submission Number: 7709