This paper applies an adaptive momentum idea to the nonlinear conjugate gradient method in order to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: minimizing a (single) differentiable function, and minimizing the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid a traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is further combined with an operator-splitting technique to handle the non-smooth function. We use the convex $\ell_1$ and the nonconvex $\ell_1-\ell_2$ functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
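As a minimal sketch of the two settings above (our own notation and a standard momentum rule, not the paper's exact scheme): for a differentiable $f$, a nonlinear conjugate gradient iteration with momentum weight $\beta_k$ and fixed step size $\eta$ reads
\[
d_k = -\nabla f(x_k) + \beta_k\, d_{k-1}, \qquad x_{k+1} = x_k + \eta\, d_k,
\]
where one classical choice is the Fletcher--Reeves weight $\beta_k = \|\nabla f(x_k)\|^2 / \|\nabla f(x_{k-1})\|^2$. For the composite problem $\min_x g(x) + f(x)$ with $g$ non-smooth, an operator-splitting variant replaces the update of $x_{k+1}$ by the forward-backward step $x_{k+1} = \operatorname{prox}_{\eta g}(x_k + \eta\, d_k)$; when $g = \lambda\|\cdot\|_1$, the proximal operator is componentwise soft-thresholding, $\bigl(\operatorname{prox}_{\eta g}(v)\bigr)_i = \operatorname{sign}(v_i)\max(|v_i| - \eta\lambda,\, 0)$.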