Keywords: ridge regression, sketching, random matrix theory, cross-validation, high-dimensional asymptotics
TL;DR: We study the structure of ridge regression in a high-dimensional asymptotic framework, and get insights about cross-validation and sketching.
Abstract: We study three fundamental problems about ridge regression: (1) what is the structure of the estimator? (2) how can cross-validation be used correctly to choose the regularization parameter? and (3) how can computation be accelerated without losing too much accuracy? We address all three in a unified large-data linear model. We give a precise representation of ridge regression as a covariance matrix-dependent linear combination of the true parameter and the noise.
We study the bias of $K$-fold cross-validation for choosing the regularization parameter, and propose a simple bias-correction. We analyze the accuracy of primal and dual sketching for ridge regression, showing they are surprisingly accurate. Our results are illustrated by simulations and by analyzing empirical data.
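As a rough illustration of the setting (not the paper's code; the authors' implementation is in the linked repository), the following sketch fits ridge regression in closed form and selects the regularization parameter by $K$-fold cross-validation; the grid of penalties and the data-generating model are placeholder choices.

```python
import numpy as np

def ridge(X, y, lam):
    # Closed-form ridge solution: (X'X + n*lam*I)^{-1} X'y
    n, p = X.shape
    return np.linalg.solve(X.T @ X + n * lam * np.eye(p), X.T @ y)

def kfold_cv_error(X, y, lam, K=5, seed=0):
    # Average held-out mean squared error over K folds
    n = X.shape[0]
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, K)
    errs = []
    for k in range(K):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(K) if j != k])
        beta = ridge(X[train], y[train], lam)
        errs.append(np.mean((y[test] - X[test] @ beta) ** 2))
    return float(np.mean(errs))

# Simulated high-dimensional data (placeholder parameters)
rng = np.random.default_rng(1)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = rng.standard_normal(p) / np.sqrt(p)
y = X @ beta_true + rng.standard_normal(n)

# Pick the penalty minimizing the K-fold CV error over a grid
grid = np.logspace(-3, 1, 20)
lam_hat = min(grid, key=lambda lam: kfold_cv_error(X, y, lam))
```

The paper's point is that for fixed $K$, the CV error curve is a biased estimate of the test error as a function of the penalty, so the minimizer above needs a bias correction.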
Code: https://github.com/liusf15/RidgeRegression