We study first-order methods for constrained min-max optimization. Existing methods require either two gradient calls or two projections in each iteration, which may be costly in some applications. In this paper, we first show that the Optimistic Gradient (OG) method, a single-call single-projection algorithm, has an $O(\frac{1}{\sqrt{T}})$ convergence rate for inclusion problems with operators that satisfy the weak Minty variational inequality (MVI). Our second result is the first single-call single-projection algorithm -- the Accelerated Reflected Gradient (ARG) method -- that achieves the optimal $O(\frac{1}{T})$ convergence rate for inclusion problems that satisfy negative comonotonicity. Both the weak MVI and negative comonotonicity are well-studied assumptions and capture a rich set of non-convex non-concave min-max optimization problems. Finally, we show that the Reflected Gradient (RG) method, another single-call single-projection algorithm, has an $O(\frac{1}{\sqrt{T}})$ last-iterate convergence rate for constrained convex-concave min-max optimization, answering an open problem of [Hsieh et al., 2019].
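To illustrate the single-call single-projection structure discussed above, the following is a minimal sketch of the Optimistic Gradient (OG) update applied to a simple constrained bilinear saddle problem. This toy setting (the problem $\min_x \max_y xy$ over a box, and the step size) is an assumption for illustration, not the paper's experimental setup; note that each iteration below makes exactly one gradient call and one projection.

```python
import numpy as np

# Illustrative sketch: OG update z_{t+1} = Proj(z_t - eta*(2*F(z_t) - F(z_{t-1})))
# on the bilinear saddle problem min_x max_y x*y over the box [-1, 1]^2,
# whose gradient operator is F(x, y) = (y, -x) with unique saddle point (0, 0).

def F(z):
    x, y = z
    return np.array([y, -x])  # operator of f(x, y) = x * y

def proj(z, lo=-1.0, hi=1.0):
    return np.clip(z, lo, hi)  # projection onto the box constraint

def optimistic_gradient(z0, eta=0.1, T=2000):
    g_prev = F(z0)
    z = proj(z0 - eta * g_prev)  # plain projected step to initialize
    for _ in range(T):
        g = F(z)                             # single gradient call
        z = proj(z - eta * (2 * g - g_prev))  # single projection
        g_prev = g
    return z

z_final = optimistic_gradient(np.array([0.9, -0.8]))
dist = np.linalg.norm(z_final)  # distance to the saddle point (0, 0)
```

On this bilinear example, plain projected gradient descent-ascent cycles, while the optimistic correction term $F(z_t) - F(z_{t-1})$ drives the iterates toward the saddle point.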