This paper introduces a general framework for iterative optimization algorithms and establishes, under general assumptions, that their convergence is asymptotically geometric. We also prove that, under appropriate assumptions, the rate of convergence can be lower bounded; the convergence is then exactly geometric, and we provide the precise asymptotic convergence rate. This framework accommodates constrained optimization and encompasses the Expectation Maximization algorithm and the mirror descent algorithm, as well as variants such as the alpha-Expectation Maximization and the Mirror Prox algorithm. Furthermore, we establish sufficient conditions for the convergence of the Mirror Prox algorithm, under which the method converges systematically to the unique minimizer of a convex function on a convex compact set.