Generalization in the Face of Adaptivity: A Bayesian Perspective

Published: 21 Sept 2023, Last Modified: 04 Jan 2024, NeurIPS 2023 spotlight
Keywords: Differential Privacy, Adaptive Data Analysis
TL;DR: We provide generalization guarantees under adaptivity that scale with the variance, rather than the range
Abstract: Repeated use of a data sample via adaptively chosen queries can rapidly lead to overfitting, wherein the empirical evaluation of queries on the sample significantly deviates from their mean with respect to the underlying data distribution. It turns out that simple noise-addition algorithms suffice to prevent this issue, and differential privacy-based analysis of these algorithms shows that they can handle an asymptotically optimal number of queries. However, differential privacy's worst-case nature entails scaling such noise to the range of the queries even for highly concentrated queries, or else introducing more complex algorithms. In this paper, we prove that straightforward noise-addition algorithms already provide variance-dependent guarantees that also extend to unbounded queries. This improvement stems from a novel characterization that illuminates the core problem of adaptive data analysis. We show that the harm of adaptivity results from the covariance between the new query and a Bayes-factor-based measure of how much information about the data sample was encoded in the responses given to past queries. We then leverage this characterization to introduce a new data-dependent stability notion that can bound this covariance.
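To make the "simple noise-addition" setting concrete, here is a minimal sketch (assuming NumPy) of answering statistical queries on a sample with Gaussian noise added to each empirical mean. The function name answer_queries and the fixed noise_scale parameter are illustrative assumptions, not the paper's calibration; the paper's contribution concerns the guarantees such mechanisms enjoy, not this code.

```python
import numpy as np

rng = np.random.default_rng(0)

def answer_queries(sample, queries, noise_scale):
    """Answer each statistical query with its empirical mean on the
    sample plus independent Gaussian noise of standard deviation
    `noise_scale` (a free parameter in this sketch)."""
    answers = []
    for q in queries:
        empirical = np.mean([q(x) for x in sample])
        answers.append(empirical + rng.normal(scale=noise_scale))
    return answers

# Example: a sample from a distribution and a few threshold queries.
# In adaptive data analysis, later queries would be chosen after
# seeing earlier (noisy) answers; they are fixed here for brevity.
sample = rng.normal(loc=0.0, scale=1.0, size=200)
queries = [lambda x, t=t: float(x > t) for t in (-1.0, 0.0, 1.0)]
print(answer_queries(sample, queries, noise_scale=0.05))
```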
Submission Number: 11598