We give a sketching-based iterative algorithm that computes a $1+\varepsilon$ approximate solution to the ridge regression problem $\min_x \|Ax-b\|_2^2 +\lambda\|x\|_2^2$, where $A \in \mathbb{R}^{n \times d}$ with $d \ge n$. Our algorithm, run for a constant number of iterations (and hence requiring a constant number of passes over the input), improves upon earlier work of Chowdhury et al. by requiring the sketching matrix to satisfy only a weaker Approximate Matrix Multiplication (AMM) guarantee that depends on $\varepsilon$, together with a constant subspace embedding guarantee; the earlier work instead requires the sketching matrix to satisfy a subspace embedding guarantee that depends on $\varepsilon$. For example, to produce a $1+\varepsilon$ approximate solution in $1$ iteration, which requires $2$ passes over the input, our algorithm requires the OSNAP embedding to have $m = O(n\sigma^2/(\lambda\varepsilon))$ rows with sparsity parameter $s = O(\log(n))$, whereas the earlier algorithm of Chowdhury et al., with the same number of OSNAP rows, requires sparsity $s = O(\sqrt{\sigma^2/(\lambda\varepsilon)} \cdot \log(n))$, where $\sigma = \opnorm{A}$ is the spectral norm of $A$. We also show that this algorithm can be used to give faster algorithms for kernel ridge regression. Finally, we show that the sketch size required by our algorithm is essentially optimal for a natural framework of algorithms for ridge regression, by proving lower bounds on oblivious sketching matrices for AMM. These sketch size lower bounds for AMM may be of independent interest.
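To make the setup concrete, here is a minimal NumPy sketch of the style of algorithm the abstract describes: ridge regression solved in the dual (exploiting $d \ge n$), where the Gram matrix $AA^\top + \lambda I$ is replaced by a cheaper sketched surrogate built from an OSNAP-style sparse embedding, and a few refinement iterations correct the approximation, each iteration costing a constant number of passes over $A$. The function names, the dense construction of the sketch, and all parameter choices are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def osnap_sketch(m, d, s, rng):
    """OSNAP-style sparse embedding S in R^{m x d}: each column has
    s nonzero entries of value +-1/sqrt(s) in distinct random rows.
    (Built densely here for clarity; a real implementation is sparse.)"""
    S = np.zeros((m, d))
    for j in range(d):
        rows = rng.choice(m, size=s, replace=False)
        S[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
    return S

def sketched_ridge(A, b, lam, m, s=4, iters=2, seed=0):
    """Iterative refinement for ridge regression in the dual (d >= n).

    The exact solution is x* = A^T (A A^T + lam I)^{-1} b. We replace the
    Gram matrix A A^T by its sketched version (S A^T)^T (S A^T), then run
    refinement steps whose residuals use A only through matrix-vector
    products, i.e., a constant number of passes over A per iteration.
    """
    n, d = A.shape
    rng = np.random.default_rng(seed)
    S = osnap_sketch(m, d, s, rng)
    SAt = S @ A.T                          # m x n sketch of A^T
    M = SAt.T @ SAt + lam * np.eye(n)      # approximates A A^T + lam I
    y = np.zeros(n)
    for _ in range(iters):
        r = b - A @ (A.T @ y) - lam * y    # exact residual: two passes over A
        y = y + np.linalg.solve(M, r)      # solve against the sketched Gram
    return A.T @ y                         # map dual solution back to primal
```

The refinement loop converges geometrically whenever the sketched Gram matrix is a good enough preconditioner, which is exactly what the AMM guarantee on the sketch provides; the $\lambda I$ term further dampens the sketching error, which is why larger $\lambda$ permits fewer sketch rows, as in the $m = O(n\sigma^2/(\lambda\varepsilon))$ bound.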