Lasso and Ridge are important minimization problems in machine learning and statistics. They are versions of linear regression with squared loss where the vector $\theta\in\mathbb{R}^d$ of coefficients is constrained in either $\ell_1$-norm (for Lasso) or in $\ell_2$-norm (for Ridge). We study the complexity of quantum algorithms for finding $\varepsilon$-minimizers for these minimization problems. We show that for Lasso we can get a quadratic quantum speedup in terms of $d$ by speeding up the cost-per-iteration of the Frank-Wolfe algorithm, while for Ridge the best quantum algorithms are linear in $d$, as are the best classical algorithms.
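The per-iteration cost that the quantum algorithm accelerates is the Frank-Wolfe linear minimization oracle over the ℓ1-ball, whose minimizer is always a signed vertex found by a maximum search over the $d$ gradient coordinates. A minimal classical sketch (function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def frank_wolfe_lasso(X, y, radius=1.0, iters=200):
    """Frank-Wolfe for min ||X @ theta - y||^2 s.t. ||theta||_1 <= radius.

    Each iteration calls a linear minimization oracle over the l1-ball;
    classically this is an argmax over d gradient entries, the step a
    quantum maximum-finding routine can speed up quadratically in d.
    """
    n, d = X.shape
    theta = np.zeros(d)
    for t in range(iters):
        grad = 2 * X.T @ (X @ theta - y)         # gradient of squared loss
        i = np.argmax(np.abs(grad))              # oracle: best coordinate
        s = np.zeros(d)
        s[i] = -radius * np.sign(grad[i])        # vertex of the l1-ball
        gamma = 2.0 / (t + 2)                    # standard FW step size
        theta = (1 - gamma) * theta + gamma * s  # convex combination
    return theta
```

Since every iterate is a convex combination of ℓ1-ball vertices, the constraint $\|\theta\|_1 \le$ radius holds automatically, with no projection step.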