Computational Efficiency under Covariate Shift in Kernel Ridge Regression

Published: 18 Sept 2025, Last Modified: 29 Oct 2025
NeurIPS 2025 spotlight
License: CC BY 4.0
Keywords: statistical learning theory, kernel methods, covariate shift, random projections
TL;DR: We show that kernel ridge regression under covariate shift can be made computationally efficient using Nyström random projections, achieving optimal statistical rates without sacrificing accuracy.
Abstract: This paper addresses the covariate shift problem in the context of nonparametric regression within reproducing kernel Hilbert spaces (RKHSs). Covariate shift arises in supervised learning when the input distributions of the training and test data differ, presenting additional challenges for learning. Although kernel methods have optimal statistical properties, their high computational demands in terms of time and, particularly, memory, limit their scalability to large datasets. To address this limitation, the main focus of this paper is to explore the trade-off between computational efficiency and statistical accuracy under covariate shift. We investigate the use of random projections where the hypothesis space consists of a random subspace within a given RKHS. Our results show that, even in the presence of covariate shift, significant computational savings can be achieved without compromising learning performance.
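The abstract describes restricting the hypothesis space to a random (Nyström) subspace of the RKHS. As a rough illustration of the general idea, and not the paper's exact estimator, the sketch below fits kernel ridge regression on the span of a few randomly sampled landmark points; the optional `weights` argument stands in for known importance weights p_test/p_train used to correct covariate shift (all function names and parameters here are illustrative assumptions).

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared distances: ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def nystrom_krr_fit(X, y, m=50, lam=1e-3, gamma=1.0, weights=None, seed=0):
    """Kernel ridge regression restricted to the span of m Nystrom landmarks.

    `weights` (hypothetical) is an optional importance-weight vector,
    e.g. an estimate of p_test/p_train, to account for covariate shift.
    Solves the m-dimensional system instead of the full n-dimensional one.
    """
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=min(m, n), replace=False)  # uniform landmark sampling
    Xm = X[idx]
    Knm = rbf_kernel(X, Xm, gamma)   # (n, m) cross-kernel
    Kmm = rbf_kernel(Xm, Xm, gamma)  # (m, m) landmark kernel
    w = np.ones(n) if weights is None else np.asarray(weights)
    # Normal equations for the projected problem:
    # (Knm^T W Knm + n*lam*Kmm) alpha = Knm^T W y
    A = Knm.T @ (w[:, None] * Knm) + n * lam * Kmm
    b = Knm.T @ (w * y)
    alpha = np.linalg.solve(A + 1e-10 * np.eye(len(idx)), b)  # tiny jitter for stability
    return Xm, alpha, gamma

def nystrom_krr_predict(model, Xtest):
    Xm, alpha, gamma = model
    return rbf_kernel(Xtest, Xm, gamma) @ alpha
```

The computational point is that the linear system solved is m x m rather than n x n, so memory and time scale with the number of landmarks instead of the full training set.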
Primary Area: Theory (e.g., control theory, learning theory, algorithmic game theory)
Submission Number: 20436