Lasso and Ridge are important minimization problems in machine learning and statistics. They are versions of linear regression with squared loss where the vector $\theta\in\mathbb{R}^d$ of coefficients is constrained in either $\ell_1$-norm (for Lasso) or in $\ell_2$-norm (for Ridge). We study the complexity of quantum algorithms for finding $\varepsilon$-minimizers for these minimization problems. We show that for Lasso we can get a quadratic quantum speedup in terms of $d$ by speeding up the cost-per-iteration of the Frank-Wolfe algorithm, while for Ridge the best quantum algorithms are linear in $d$, as are the best classical algorithms. As a byproduct of our quantum lower bound for Lasso, we also prove the first classical lower bound for Lasso that is tight up to polylog-factors.
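To make the role of Frank-Wolfe concrete, here is a minimal classical sketch of the Frank-Wolfe method applied to the Lasso problem $\min_{\|\theta\|_1\le R}\frac{1}{2n}\|X\theta-y\|_2^2$. This is not the paper's quantum algorithm; it only illustrates the per-iteration structure (a linear minimization over the $\ell_1$-ball, whose solution is a single signed scaled basis vector) that the quantum speedup in $d$ accelerates. Function and variable names are illustrative.

```python
import numpy as np

def frank_wolfe_lasso(X, y, R=1.0, iters=500):
    """Frank-Wolfe for min ||X theta - y||^2 / (2n) s.t. ||theta||_1 <= R.

    The linear minimization oracle over the l1-ball returns a vertex
    -R * sign(grad_i) * e_i, where i maximizes |grad_i|; finding this
    coordinate is the step a quantum algorithm can speed up in d.
    """
    n, d = X.shape
    theta = np.zeros(d)
    for t in range(iters):
        grad = X.T @ (X @ theta - y) / n       # gradient of the squared loss
        i = np.argmax(np.abs(grad))            # coordinate with largest |gradient|
        s = np.zeros(d)
        s[i] = -R * np.sign(grad[i])           # vertex of the l1-ball
        gamma = 2.0 / (t + 2)                  # standard Frank-Wolfe step size
        theta = (1 - gamma) * theta + gamma * s
    return theta

# Sanity check on synthetic data with a feasible sparse ground truth.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
true_theta = np.array([0.5, -0.3, 0.0, 0.0, 0.0])  # ||true_theta||_1 = 0.8
y = X @ true_theta
theta = frank_wolfe_lasso(X, y, R=0.8)
```

Every iterate is a convex combination of $\ell_1$-ball vertices, so feasibility $\|\theta\|_1\le R$ is maintained automatically, and after $t$ iterations the suboptimality is $O(1/t)$.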