The monotone variational inequality is a central problem in mathematical programming that unifies and generalizes many important settings such as smooth convex optimization, two-player zero-sum games, convex-concave saddle point problems, etc. The extragradient algorithm by Korpelevich [1976] and the optimistic gradient descent-ascent algorithm by Popov [1980] are arguably the two most classical and popular methods for solving monotone variational inequalities. Despite their long history, the following major problem remains open: what is the last-iterate convergence rate of the extragradient algorithm or the optimistic gradient descent-ascent algorithm for monotone and Lipschitz variational inequalities with constraints? We resolve this open problem by showing that both the extragradient algorithm and the optimistic gradient descent-ascent algorithm have a tight $O\left(\frac{1}{\sqrt{T}}\right)$ last-iterate convergence rate for arbitrary convex feasible sets, which matches the lower bound by Golowich et al. [2020a, b]. Our rate is measured in terms of the standard gap function. At the core of our results lies a new performance measure -- the tangent residual, which can be viewed as an adaptation of the norm of the operator that takes the local constraints into account. We use the tangent residual (resp. a slight variation of the tangent residual) as the performance measure in our analysis of the extragradient algorithm (resp. the optimistic gradient descent-ascent algorithm). To establish the monotonicity of these performance measures, we develop a new approach that combines the power of sum-of-squares programming with the low dimensionality of the update rule of the extragradient or the optimistic gradient descent-ascent algorithm. We believe our approach has many additional applications in the analysis of iterative methods.
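To make the update rule concrete, the following is a minimal sketch of the projected extragradient iteration analyzed in the abstract, applied to a simple monotone operator. The step size `eta`, the unit-ball feasible set, and the bilinear test operator `F(x, y) = (y, -x)` (arising from the saddle problem $\min_x \max_y xy$) are illustrative choices, not taken from the paper.

```python
import numpy as np

def project_ball(z, radius=1.0):
    # Euclidean projection onto the convex feasible set {z : ||z|| <= radius}.
    n = np.linalg.norm(z)
    return z if n <= radius else z * (radius / n)

def extragradient(F, z0, eta=0.1, T=2000, proj=project_ball):
    # Korpelevich's extragradient method:
    #   z_{t+1/2} = Proj(z_t - eta * F(z_t))        (extrapolation step)
    #   z_{t+1}   = Proj(z_t - eta * F(z_{t+1/2}))  (update step)
    z = np.asarray(z0, dtype=float)
    for _ in range(T):
        z_half = proj(z - eta * F(z))
        z = proj(z - eta * F(z_half))
    return z

# Illustrative monotone operator from the bilinear saddle problem min_x max_y x*y;
# its unique solution on the unit ball is (0, 0).
F = lambda z: np.array([z[1], -z[0]])
z_last = extragradient(F, z0=[0.9, -0.7])
```

Note that plain gradient descent-ascent cycles or diverges on this bilinear example; the extrapolation step is what makes the last iterate converge.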