Generalization Error Rates in Kernel Regression: The Crossover from the Noiseless to Noisy Regime

21 May 2021, 20:46 (edited 26 Oct 2021) · NeurIPS 2021 Poster · Readers: Everyone
  • Keywords: Statistical Physics, Kernel ridge regression, Teacher-Student, Replica method, High-dimensional statistics, Gaussian Design
  • TL;DR: We show that the error rates in kernel ridge regression cross over from their "noiseless" values to their "noisy" values as the number of samples increases
  • Abstract: In this manuscript we consider Kernel Ridge Regression (KRR) under the Gaussian design. Exponents for the decay of the excess generalization error of KRR have been reported in various works under the assumption of power-law decay of the eigenvalues of the feature covariance. These decays were, however, provided for markedly different setups, namely the noiseless case with constant regularization and the noisy optimally regularized case. Intermediary settings have been left substantially uncharted. In this work, we unify and extend this line of work, providing a characterization of all regimes and excess error decay rates that can be observed in terms of the interplay of noise and regularization. In particular, we show the existence of a transition in the noisy setting from the noiseless exponents to their noisy values as the sample complexity is increased. Finally, we illustrate how this crossover can also be observed on real data sets.
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: https://github.com/IdePHICS/KernelRidgeCrossover
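The setup described in the abstract can be illustrated with a minimal numerical sketch: ridge regression in feature space (equivalent to KRR with a linear kernel) under Gaussian design, with feature-covariance eigenvalues decaying as a power law. All concrete choices below (dimension `d`, decay exponent `alpha`, regularization `lam`, the isotropic teacher) are illustrative assumptions, not the paper's exact setting; see the linked repository for the authors' actual experiments.

```python
import numpy as np

def krr_excess_error(n, d=500, alpha=1.5, noise_std=0.0, lam=1e-4, seed=0):
    """Excess generalization error of ridge regression under Gaussian design
    with power-law covariance spectrum (illustrative sketch; d, alpha, lam
    are hypothetical choices, not the paper's exact setup)."""
    rng = np.random.default_rng(seed)
    # Power-law eigenvalues of the feature covariance: lambda_k ~ k^(-alpha)
    eigs = np.arange(1, d + 1, dtype=float) ** (-alpha)
    # Isotropic teacher weights (teacher-student setting)
    w_star = rng.standard_normal(d)
    # Gaussian design: rows x ~ N(0, diag(eigs))
    X = rng.standard_normal((n, d)) * np.sqrt(eigs)
    y = X @ w_star + noise_std * rng.standard_normal(n)
    # Ridge estimator with per-sample regularization strength lam
    w_hat = np.linalg.solve(X.T @ X + lam * n * np.eye(d), X.T @ y)
    # Excess error = (w_hat - w_star)^T Sigma (w_hat - w_star)
    diff = w_hat - w_star
    return float(diff @ (eigs * diff))

if __name__ == "__main__":
    # The noiseless and noisy errors both decay with n, but at different rates,
    # which is the crossover the paper characterizes.
    for n in (100, 400, 1600):
        e0 = krr_excess_error(n, noise_std=0.0)
        e1 = krr_excess_error(n, noise_std=1.0)
        print(f"n={n:5d}  noiseless err={e0:.4f}  noisy err={e1:.4f}")
```

With label noise, the error at large sample size is dominated by the variance term, so its decay exponent differs from the noiseless one; running the sketch for a range of `n` makes the two regimes visible.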