In this paper, we propose a clean and general proof framework for establishing the convergence analysis of the Difference-of-Convex (DC) programming algorithm (DCA) for both the standard DC program and the convex-constrained DC program. We first discuss suitable assumptions for the well-definedness of DCA. We then focus on the convergence analysis of DCA, in particular the global convergence of the sequence $\{x^k\}$ generated by DCA under the Łojasiewicz subgradient inequality and the Kurdyka-Łojasiewicz property, respectively. Moreover, the convergence rates of the sequences $\{f(x^k)\}$ and $\{\|x^k-x^*\|\}$ are also investigated. We hope that the proof framework presented in this article will be a useful tool for conveniently establishing the convergence analysis of many variants of DCA and of new DCA-type algorithms.
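For context, the standard DCA iteration referred to above can be sketched as follows (this is the classical scheme; the notation $g$, $h$, $y^k$ is generic and not taken from this abstract): given a DC decomposition of the objective, each iteration linearizes the concave part at the current point and solves the resulting convex subproblem.

```latex
% Standard DC program: minimize f(x) = g(x) - h(x), with g, h convex.
% DCA iteration (classical scheme; notation assumed for illustration):
%   choose a subgradient of h at the current iterate, then solve a convex problem.
\begin{align*}
  y^k &\in \partial h(x^k), \\
  x^{k+1} &\in \operatorname*{arg\,min}_{x} \; \bigl\{\, g(x) - \langle y^k, x \rangle \,\bigr\}.
\end{align*}
```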