The Price of Robustness: Stable Classifiers Need Overparameterization

Published: 05 Nov 2025, Last Modified: 05 Nov 2025, NLDL 2026 Abstracts, CC BY 4.0
Keywords: concentration inequalities, isoperimetry, robustness, stability, classification problems, generalization, overparameterization
TL;DR: We show that interpolating classifiers can only be stable, and thus generalize well, if they are sufficiently overparameterized.
Abstract: The link between overparameterization, robustness, and generalization in discontinuous classifiers remains unclear. We establish generalization bounds that tighten with _class stability_, the expected distance to the decision boundary, yielding a _law of robustness for classification_ that extends prior smoothness-based settings. As a consequence, any interpolating model with $p \approx n$ parameters is necessarily _unstable_, implying that robust generalization requires overparameterization. For infinite function classes, we obtain analogous results through a stronger robustness measure, the _normalized co-stability_, defined via output margins. Empirical results support our theory: stability grows with model size and aligns closely with test performance.
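
The central quantity above is class stability, the expected distance of a sample to the decision boundary. As a rough illustration only (not the authors' code or definition), the minimal sketch below estimates this quantity for a linear classifier, where the distance to the boundary has the closed form $|w \cdot x + b| / \|w\|$; the synthetic dataset and all names are assumptions made for the example.

```python
# Hedged sketch: empirically estimating "class stability" (expected distance to
# the decision boundary) for a *linear* classifier, where the distance has the
# closed form |w.x + b| / ||w||. The paper's general definition may differ in
# detail; this only illustrates the kind of quantity being measured.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Signed score w.x + b, converted to geometric distance to the hyperplane.
w, b = clf.coef_.ravel(), clf.intercept_[0]
dist = np.abs(X_te @ w + b) / np.linalg.norm(w)

class_stability = dist.mean()          # empirical expected distance to the boundary
test_accuracy = clf.score(X_te, y_te)  # compare stability against generalization
print(f"class stability ~ {class_stability:.3f}, test accuracy = {test_accuracy:.3f}")
```

For nonlinear models the distance has no closed form and would have to be approximated, e.g., by searching for the smallest perturbation that flips the predicted label.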
Serve As Reviewer: ~Jonas_von_Berg1
Submission Number: 34