In this paper, we introduce a proximal-proximal majorization-minimization (PPMM) algorithm for nonconvex tuning-free robust regression problems. The basic idea is to apply the proximal majorization-minimization algorithm to the nonconvex problem, with the inner subproblems solved by a proximal point algorithm (PPA) based on a sparse semismooth Newton (SSN) method. We emphasize that the main difficulty in designing the algorithm lies in how to overcome the singularity issue of the inner subproblem. Furthermore, we prove that the PPMM algorithm converges to a d-stationary point. By leveraging the Kurdyka-Łojasiewicz (KL) property of the problem, we also establish the convergence rate of the PPMM algorithm. Numerical experiments demonstrate that the proposed algorithm outperforms existing state-of-the-art algorithms.
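To illustrate the overall structure described above, the following is a minimal sketch (not the paper's implementation) of the outer proximal majorization-minimization loop for a tuning-free robust regression objective with a nonconvex MCP penalty. The choice of penalty, the parameters `lam`, `gamma`, `sigma`, and the use of CVXPY to solve each convex inner subproblem are all illustrative assumptions; in the paper the inner subproblems are instead solved by the SSN-based PPA.

```python
import numpy as np
import cvxpy as cp

def mcp_concave_grad(x, lam, gamma):
    # MCP(t) = lam*|t| - q(t), with q convex and differentiable:
    # q'(t) = t/gamma for |t| <= gamma*lam, and lam*sign(t) otherwise.
    return np.where(np.abs(x) <= gamma * lam, x / gamma, lam * np.sign(x))

def ppmm_sketch(A, b, lam=0.1, gamma=3.0, sigma=1.0, max_iter=20, tol=1e-6):
    """Outer proximal MM loop: linearize the concave part of the MCP penalty
    at x_k, add a proximal term, and solve the resulting convex subproblem
    (here with CVXPY rather than the paper's SSN-based PPA)."""
    n = A.shape[1]
    x_k = np.zeros(n)
    for _ in range(max_iter):
        g_k = mcp_concave_grad(x_k, lam, gamma)   # slope of the linearized concave part
        x = cp.Variable(n)
        obj = (cp.norm(A @ x - b, 2)              # tuning-free (square-root) robust loss
               + lam * cp.norm(x, 1) - g_k @ x    # convex majorant of the MCP penalty
               + (sigma / 2) * cp.sum_squares(x - x_k))  # proximal term
        cp.Problem(cp.Minimize(obj)).solve()
        x_new = x.value
        if np.linalg.norm(x_new - x_k) <= tol * max(1.0, np.linalg.norm(x_k)):
            x_k = x_new
            break
        x_k = x_new
    return x_k

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100); x_true[:5] = 1.0
    b = A @ x_true + 0.1 * rng.standard_normal(50)
    b[::10] += 5.0                                # a few gross outliers
    print(np.round(ppmm_sketch(A, b)[:8], 3))
```

Each outer iteration solves a convex problem, so the sketch only conveys the majorization-plus-proximal structure; the efficiency of the actual PPMM algorithm comes from exploiting second-order sparsity in the SSN-based inner solver.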