The Kaczmarz and Gauss-Seidel methods aim to solve an $n \times p$ linear system $\boldsymbol{X}\boldsymbol{\beta} = \boldsymbol{y}$ by iteratively refining the solution estimate; the former uses random rows of $\boldsymbol{X}$ (to update $\boldsymbol{\beta}$ given the corresponding equations) and the latter uses random columns of $\boldsymbol{X}$ (to update the corresponding coordinates of $\boldsymbol{\beta}$). Interest in these methods was recently revitalized by a proof of Strohmer and Vershynin showing linear convergence in expectation for a randomized Kaczmarz method variant (RK), and a similar result for the randomized Gauss-Seidel algorithm (RGS) was later proved by Lewis and Leventhal. Recent work unified the analysis of these algorithms for overcomplete and undercomplete systems, showing convergence to the ordinary least squares (OLS) solution and to the minimum Euclidean norm solution, respectively. This paper considers the natural follow-up to the OLS problem, ridge regression, which solves $(\boldsymbol{X}^{\top}\boldsymbol{X} + \lambda\boldsymbol{I})\boldsymbol{\beta} = \boldsymbol{X}^{\top}\boldsymbol{y}$. We present particular variants of RK and RGS for solving this system and derive their convergence rates. We compare these to a recent proposal by Ivanov and Zhdanov for solving this system, which can be interpreted as randomly sampling both rows and columns, and which we argue is often suboptimal. Instead, we claim that one should always use RGS (columns) when $n > p$ and RK (rows) when $p > n$. This difference in behavior is simply related to the minimum eigenvalue of two related positive semidefinite matrices, $\boldsymbol{X}^{\top}\boldsymbol{X} + \lambda\boldsymbol{I}_p$ and $\boldsymbol{X}\boldsymbol{X}^{\top} + \lambda\boldsymbol{I}_n$, when $n > p$ or $p > n$, respectively.
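For concreteness, the following is a brief sketch of the standard RK and RGS update rules for the generic system $\boldsymbol{X}\boldsymbol{\beta} = \boldsymbol{y}$; the ridge-specific variants presented in this paper modify the system being sampled but follow the same row-versus-column pattern. Here $\boldsymbol{x}_i^{\top}$ denotes the $i$th row of $\boldsymbol{X}$, $\boldsymbol{X}_{(j)}$ its $j$th column (this column notation is introduced here for illustration only), and $\boldsymbol{e}_j$ the $j$th standard basis vector; rows and columns are assumed to be sampled with probability proportional to their squared Euclidean norms, as in the randomized variants cited above.
\begin{align*}
\text{RK (rows):}\quad & \boldsymbol{\beta}^{(t+1)} = \boldsymbol{\beta}^{(t)} + \frac{y_i - \boldsymbol{x}_i^{\top}\boldsymbol{\beta}^{(t)}}{\|\boldsymbol{x}_i\|_2^2}\,\boldsymbol{x}_i, \\
\text{RGS (columns):}\quad & \boldsymbol{\beta}^{(t+1)} = \boldsymbol{\beta}^{(t)} + \frac{\boldsymbol{X}_{(j)}^{\top}\bigl(\boldsymbol{y} - \boldsymbol{X}\boldsymbol{\beta}^{(t)}\bigr)}{\|\boldsymbol{X}_{(j)}\|_2^2}\,\boldsymbol{e}_j.
\end{align*}
The RK step projects the current iterate onto the hyperplane defined by the sampled equation, while the RGS step exactly minimizes the residual $\|\boldsymbol{y} - \boldsymbol{X}\boldsymbol{\beta}\|_2^2$ over the sampled coordinate of $\boldsymbol{\beta}$.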