The accelerated proximal point algorithm (APPA), also known as "Catalyst", is a well-established reduction from convex optimization to approximate proximal point computation (i.e., regularized minimization). This reduction is conceptually elegant and yields strong convergence rate guarantees. However, these rates feature an extraneous logarithmic term arising from the need to compute each proximal point to high accuracy. In this work, we propose a novel Relaxed Error Criterion for Accelerated Proximal Point (RECAPP) that eliminates the need for high accuracy subproblem solutions. We apply RECAPP to two canonical problems: finite-sum and max-structured minimization. For finite-sum problems, we match the best known complexity, previously obtained by carefully-designed problem-specific algorithms. For minimizing $\max_y f(x,y)$ where $f$ is convex in $x$ and strongly-concave in $y$, we improve on the best known (Catalyst-based) bound by a logarithmic factor.
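For concreteness, the proximal point computation the reduction relies on is a regularized minimization subproblem; under one common convention (with regularization parameter $\lambda > 0$ and center point $\bar{x}$, notation illustrative rather than taken from this work), it reads
\[
  \operatorname*{arg\,min}_{x} \Big\{ f(x) + \tfrac{\lambda}{2}\,\|x - \bar{x}\|^{2} \Big\}.
\]
APPA/Catalyst-style methods repeatedly solve such subproblems approximately, and the logarithmic overhead noted above stems from requiring each of them to be solved to high accuracy.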