This paper applies an adaptive-momentum idea for the nonlinear conjugate gradient method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: a (single) differentiable function, and the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is further combined with an operator-splitting technique to handle the non-smooth function. We use the convex $\ell_1$ and the nonconvex $\ell_1-\ell_2$ functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
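For the second setting, here is a hedged sketch of operator splitting on the convex $\ell_1$ case, using a forward-backward (ISTA-style) loop for $\min_x \tfrac12\|Ax-b\|_2^2 + \lambda\|x\|_1$. The soft-thresholding prox and the plain gradient step are standard building blocks, not the paper's accelerated algorithm; the acceleration above would replace the gradient step.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_l1(A, b, lam, step, iters=1000):
    # Forward-backward splitting: a gradient step on the smooth
    # least-squares term, followed by the l1 prox on the non-smooth term.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Usage: with A = I the minimizer is soft_threshold(b, lam) in closed form.
x = ista_l1(np.eye(3), np.array([2.0, 0.1, -3.0]), lam=0.5, step=1.0, iters=10)
```

The nonconvex $\ell_1-\ell_2$ case studied in the paper follows the same splitting pattern but with a different proximal (or difference-of-convex) step in place of plain soft thresholding.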