For interpolating kernel machines, minimizing the norm of the ERM solution minimizes stability

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Blind Submission
Keywords: Stability, Linear Regression, Kernel Regression, Cross Validation Leave One Out Stability, Minimum norm solutions, Interpolation, Double Descent
Abstract: We study the average cross-validation leave-one-out (CVloo) stability of kernel ridgeless regression and derive corresponding risk bounds. We show that the interpolating solution with minimum norm minimizes a bound on CVloo stability, which in turn is controlled by the condition number of the empirical kernel matrix. The latter can be characterized in the asymptotic regime where both the dimension and the cardinality of the data go to infinity. Under the assumption of random kernel matrices, the corresponding test error should be expected to follow a double descent curve.
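As a rough illustration of the objects the abstract refers to (a sketch, not the paper's code): the minimum-norm interpolating solution of kernel ridgeless regression can be computed via the pseudoinverse of the empirical kernel matrix, whose condition number is the quantity that the abstract says controls the stability bound. The Gaussian kernel, sample size, and bandwidth below are arbitrary choices for the demonstration.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Pairwise squared distances, then the RBF (Gaussian) kernel.
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
n, d = 30, 5                       # illustrative sample size and dimension
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

K = gaussian_kernel(X, X)

# Minimum-norm interpolating solution: alpha = K^+ y, where K^+ is the
# Moore-Penrose pseudoinverse of the empirical kernel matrix.
alpha = np.linalg.pinv(K) @ y

# The solution interpolates the training data (up to numerical precision).
print(np.max(np.abs(K @ alpha - y)))

# Condition number of the empirical kernel matrix, which the paper argues
# controls a bound on CVloo stability.
print(np.linalg.cond(K))
```

Here `np.linalg.pinv` returns the minimum-norm least-squares solution among all interpolants, which is why it is the natural computational stand-in for the minimum-norm ERM solution discussed in the abstract.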
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=x6m6Y3jN0o