Abstract: The Kaczmarz and Gauss--Seidel methods aim to solve an $m \times n$ linear system $X{\beta} = {y}$ by iteratively refining the solution estimate; the former uses random rows of $X$ to update ${\beta}$ given the corresponding equations, and the latter uses random columns of $X$ to update the corresponding coordinates of ${\beta}$. Recent work analyzed these algorithms in a parallel comparison for overcomplete and undercomplete systems, showing convergence to the ordinary least squares (OLS) solution and to the minimum Euclidean norm solution, respectively. This paper considers the natural follow-up to the OLS problem: ridge regression, also known as Tikhonov regularization. By viewing both methods as variants of randomized coordinate descent, we present variants of randomized Kaczmarz (RK) and randomized Gauss--Seidel (RGS) for solving this regularized problem and derive their convergence rates. We prove that a recent proposal, which can be interpreted as randomizing over both rows and columns, is strictly suboptimal; instead, one should always work with randomly selected columns (RGS) when $m > n$ (more rows than columns) and with randomly selected rows (RK) when $n > m$ (more columns than rows).
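To make the row/column distinction concrete, below is a minimal sketch of the two update styles for the ridge objective $\min_{\beta} \frac{1}{2}\|X\beta - y\|^2 + \frac{\lambda}{2}\|\beta\|^2$. It is not pseudocode from the paper: the RGS-style update here is exact coordinate descent on the primal objective using one column of $X$ per step, and the RK-style update is coordinate descent on the dual system $(XX^\top + \lambda I)\alpha = y$ (with $\beta = X^\top \alpha$) using one row per step; uniform sampling and exact coordinate step sizes are simplifying assumptions, and the paper's sampling distributions may differ.

```python
# A minimal sketch, assuming the standard ridge objective
#   min_beta 0.5*||X beta - y||^2 + 0.5*lam*||beta||^2;
# NOT pseudocode from the paper. Uniform sampling and exact
# coordinate step sizes are simplifying choices made here.
import numpy as np

def rgs_ridge(X, y, lam, n_iters, rng):
    """Column-based (RGS-style) step: exact coordinate descent on the
    primal ridge objective, touching one column of X per iteration."""
    m, n = X.shape
    beta = np.zeros(n)
    residual = -y.copy()              # maintains residual = X @ beta - y
    col_norms = (X ** 2).sum(axis=0)  # ||X[:, j]||^2 for each column
    for _ in range(n_iters):
        j = rng.integers(n)
        grad_j = X[:, j] @ residual + lam * beta[j]
        step = grad_j / (col_norms[j] + lam)
        beta[j] -= step
        residual -= step * X[:, j]
    return beta

def rk_ridge(X, y, lam, n_iters, rng):
    """Row-based (RK-style) step: coordinate descent on the dual system
    (X X^T + lam I) alpha = y, touching one row of X per iteration;
    the primal iterate beta = X^T alpha is maintained incrementally."""
    m, n = X.shape
    alpha = np.zeros(m)
    beta = np.zeros(n)                # maintains beta = X.T @ alpha
    row_norms = (X ** 2).sum(axis=1)  # ||X[i, :]||^2 for each row
    for _ in range(n_iters):
        i = rng.integers(m)
        grad_i = X[i] @ beta + lam * alpha[i] - y[i]
        step = grad_i / (row_norms[i] + lam)
        alpha[i] -= step
        beta -= step * X[i]
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, lam = 200, 50, 1.0          # m > n: the regime where RGS is favored
    X, y = rng.standard_normal((m, n)), rng.standard_normal(m)
    exact = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
    print(np.linalg.norm(rgs_ridge(X, y, lam, 20000, rng) - exact))
    print(np.linalg.norm(rk_ridge(X, y, lam, 20000, rng) - exact))
```

Both iterates converge to the same ridge solution; the comparison in the paper concerns which of the two converges faster, which depends on the shape of $X$.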