Minimum Norm Interpolation Meets The Local Theory of Banach Spaces

Published: 02 May 2024 · Last Modified: 25 Jun 2024 · ICML 2024 Poster · CC BY 4.0
Abstract: Minimum-norm interpolators have recently gained attention, primarily as an analyzable model for shedding light on the double descent phenomenon observed for neural networks. The majority of the work has focused on analyzing interpolators in Hilbert spaces, where an effectively low-rank structure of the feature covariance typically prevents a large bias. More recently, tight vanishing bounds have also been shown for isotropic high-dimensional data in $\ell_p$-spaces with $p\in[1,2)$, leveraging the sparse structure of the ground truth. However, these proofs are tailored to specific settings and hard to generalize. This paper takes a first step towards establishing a general framework that connects the generalization properties of such interpolators to well-known concepts from high-dimensional geometry, specifically from the local theory of Banach spaces. In particular, we show that under $2$-uniform convexity, the bias of the minimal norm solution is bounded by the Gaussian complexity of the class. We then prove a "reverse" Efron-Stein lower bound on the expected conditional variance of the minimal norm solution under cotype $2$. Finally, we prove that this bound is sharp for $\ell_p$-linear regression under sub-Gaussian covariates.
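For concreteness, the central object can be sketched as follows (the notation here is ours and may differ from the paper's): given a design matrix $X \in \mathbb{R}^{n \times d}$ and responses $y = X\beta^* + \xi$ with noise $\xi$, the minimum-$\ell_p$-norm interpolator is
$$\hat{\beta} \;\in\; \arg\min_{\beta \in \mathbb{R}^d} \|\beta\|_p \quad \text{subject to} \quad X\beta = y.$$
Its risk is then controlled by treating two terms separately: a bias term, depending on the ground truth $\beta^*$, which the paper bounds by the Gaussian complexity of the class under $2$-uniform convexity, and a variance term, depending on the noise $\xi$, for which the paper proves a "reverse" Efron-Stein lower bound under cotype $2$.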
Submission Number: 6365