We consider the problem of finding a saddle point of the convex-concave objective $\min_x \max_y f(x) + \langle Ax, y\rangle - g^*(y)$, where $f$ is a convex function with locally Lipschitz gradient and $g$ is convex and possibly non-smooth. We propose an adaptive version of the Condat-V\~u algorithm, which alternates between primal gradient steps and dual proximal steps. The method achieves stepsize adaptivity through a simple rule involving $\|A\|$ and the norms of recently computed gradients of $f$. Under standard assumptions, we prove an $\mathcal{O}(k^{-1})$ ergodic convergence rate. Furthermore, when $f$ is also locally strongly convex and $A$ has full row rank, we show that our method converges at a linear rate. Numerical experiments illustrating the practical performance of the algorithm are provided.
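To make the primal-dual structure concrete, the sketch below implements a Condat-V\~u-style iteration: a gradient step on $f$ combined with $A^\top y$, followed by a proximal step on $g^*$ with the over-relaxed point $2x_{k+1}-x_k$. The stepsize rule shown (a running local-Lipschitz estimate of $\nabla f$ combined with $\|A\|$) is a generic illustration of the idea, not the paper's exact adaptive rule, and the initial estimate `L_est` is an assumption. The demo solves $\min_x \tfrac12\|x-b\|^2 + \|Ax\|_1$ with $A=I$, whose solution is the soft-thresholding of $b$.

```python
import numpy as np

def adaptive_condat_vu(grad_f, prox_gstar, A, x0, y0, n_iter=5000):
    """Primal-dual iteration in the style of Condat-Vu with a heuristic
    adaptive stepsize (illustrative only; not the paper's exact rule)."""
    norm_A = np.linalg.norm(A, 2)          # spectral norm ||A||
    x, y = x0.astype(float).copy(), y0.astype(float).copy()
    x_old, g_old = x.copy(), grad_f(x)
    L_est = 1.0                            # initial local Lipschitz guess (assumption)
    sigma = 1.0                            # dual stepsize, kept fixed here
    for _ in range(n_iter):
        g = grad_f(x)
        dx = np.linalg.norm(x - x_old)
        if dx > 1e-12:                     # secant estimate of the local Lipschitz constant
            L_est = max(L_est, np.linalg.norm(g - g_old) / dx)
        tau = 1.0 / (sigma * norm_A**2 + L_est)   # primal stepsize from L_est and ||A||
        x_old, g_old = x.copy(), g
        x_new = x - tau * (g + A.T @ y)    # primal gradient step
        y = prox_gstar(y + sigma * (A @ (2 * x_new - x)), sigma)  # dual proximal step
        x = x_new
    return x, y

# Demo: min_x 0.5*||x - b||^2 + ||x||_1  (A = I, g = l1-norm,
# so prox of g* is projection onto the l-infinity unit ball).
b = np.array([2.0, 0.5, -3.0])
A = np.eye(3)
grad_f = lambda x: x - b
prox_gstar = lambda v, s: np.clip(v, -1.0, 1.0)
x, y = adaptive_condat_vu(grad_f, prox_gstar, A, np.zeros(3), np.zeros(3))
# x approaches the soft-thresholding of b: sign(b) * max(|b| - 1, 0)
```

Here $f$ is quadratic, so the secant estimate recovers its Lipschitz constant exactly; for general $f$ with only locally Lipschitz gradient, such an estimate adapts to local curvature rather than requiring a global constant.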