In statistics, the least absolute shrinkage and selection operator (Lasso) is a regression method that performs both variable selection and regularization. A large body of literature discusses the statistical properties of the regression coefficients estimated by the Lasso, but a comprehensive review of the algorithms used to solve the underlying optimization problem is still lacking. In this review, we summarize five representative algorithms for optimizing the Lasso objective function: the iterative shrinkage-thresholding algorithm (ISTA), the fast iterative shrinkage-thresholding algorithm (FISTA), the coordinate gradient descent algorithm (CGDA), the smooth L1 algorithm (SLA), and the path following algorithm (PFA). In addition, we compare their convergence rates as well as their potential strengths and weaknesses.
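For context, the optimization problem referenced above is the standard Lasso objective; the notation below (response vector $y$, design matrix $X$, coefficient vector $\beta$, sample size $n$, regularization parameter $\lambda$) is assumed here for illustration rather than taken from the paper:

$$\hat{\beta} \;=\; \arg\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2 \;+\; \lambda \lVert \beta \rVert_1 .$$

The $\ell_1$ penalty is what induces both shrinkage and exact zeros in $\hat{\beta}$, which is why the same estimator performs regularization and variable selection simultaneously; the five algorithms reviewed here differ in how they handle the non-differentiability of this penalty.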