In this paper we study a convex-concave saddle-point problem $\min_x\max_y f(x) + y^\top\mathbf{A} x - g(y)$, where $f(x)$ and $g(y)$ are smooth and convex functions. We propose an Accelerated Primal-Dual Gradient Method for solving this problem which (i) achieves an optimal linear convergence rate in the strongly-convex-strongly-concave regime, matching the lower complexity bound (Zhang et al., 2021), and (ii) achieves an accelerated linear convergence rate when only one of the functions $f(x)$ and $g(y)$ is strongly convex, or even when neither is. Finally, we obtain a linearly-convergent algorithm for the general smooth convex-concave saddle-point problem $\min_x\max_y F(x,y)$ without any requirement of strong convexity or strong concavity.
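To make the problem setup concrete, the following is a minimal sketch (not the paper's accelerated method) of plain primal-dual gradient descent-ascent on the bilinearly coupled objective $f(x) + y^\top\mathbf{A} x - g(y)$, using the hypothetical choices $f(x) = \tfrac12\|x\|^2$ and $g(y) = \tfrac12\|y\|^2$, for which the unique saddle point is $(x^*, y^*) = (0, 0)$:

```python
import numpy as np

# Hypothetical strongly convex choices: f(x) = 0.5*||x||^2, g(y) = 0.5*||y||^2.
def grad_f(x):
    return x

def grad_g(y):
    return y

A = np.array([[1.0, 0.5],
              [0.5, 1.0]])  # coupling matrix (illustrative values)

# Non-accelerated primal-dual gradient step:
#   x <- x - eta * (grad f(x) + A^T y)   (descent in the primal variable)
#   y <- y + eta * (A x - grad g(y))     (ascent in the dual variable)
x = np.array([1.0, -1.0])
y = np.array([0.5, 0.5])
eta = 0.2  # step size, small enough for the strongly monotone regime

for _ in range(500):
    x_new = x - eta * (grad_f(x) + A.T @ y)
    y = y + eta * (A @ x - grad_g(y))
    x = x_new

print(np.linalg.norm(x), np.linalg.norm(y))  # both approach 0, the saddle point
```

This baseline converges linearly here only because both $f$ and $g$ were chosen strongly convex; the paper's contribution is an accelerated scheme whose rate is optimal in that regime and which remains linearly convergent when strong convexity is relaxed.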