The monotone variational inequality is a central problem in mathematical programming that unifies and generalizes many important settings such as smooth convex optimization, two-player zero-sum games, and convex-concave saddle point problems. The extragradient algorithm by Korpelevich [1976] and the optimistic gradient descent-ascent algorithm by Popov [1980] are arguably the two most classical and popular methods for solving monotone variational inequalities. Despite their long histories, the following major question has remained open: what is the last-iterate convergence rate of the extragradient algorithm or the optimistic gradient descent-ascent algorithm for monotone and Lipschitz variational inequalities with constraints? We resolve this open problem by showing that both the extragradient algorithm and the optimistic gradient descent-ascent algorithm achieve a tight $O\left(\frac{1}{\sqrt{T}}\right)$ last-iterate convergence rate for arbitrary convex feasible sets, which matches the lower bound by Golowich et al. [2020a,b]. Our rate is measured in terms of the standard gap function. At the core of our results lies a non-standard performance measure: the tangent residual, which can be viewed as an adaptation of the norm of the operator that takes the local constraints into account. We use the tangent residual (or a slight variation of it) as the potential function in our analysis of the extragradient algorithm (respectively, the optimistic gradient descent-ascent algorithm) and prove that it is non-increasing between two consecutive iterates.
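Since the abstract refers to the two algorithms only by name, the following minimal Python sketch may help fix the update rules. It is illustrative only: the Euclidean-ball feasible set (`proj_ball`), the bilinear saddle-point operator `F`, and the parameters `eta` and `T` are stand-in assumptions, not choices from the paper, and the OGDA loop implements one common projected variant of Popov's method.

```python
import numpy as np

# Illustrative monotone operator: F(x, y) = (A y, -A^T x), i.e. the operator
# of the bilinear saddle-point problem min_x max_y x^T A y.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

def F(z):
    x, y = z[:5], z[5:]
    return np.concatenate([A @ y, -A.T @ x])

def proj_ball(z, radius=1.0):
    """Euclidean projection onto a ball: a simple stand-in for an
    arbitrary convex feasible set."""
    n = np.linalg.norm(z)
    return z if n <= radius else z * (radius / n)

def extragradient(z0, eta=0.05, T=2000):
    """Projected extragradient (Korpelevich, 1976): an extrapolation
    step, then an update using the operator at the extrapolated point."""
    z = z0
    for _ in range(T):
        z_half = proj_ball(z - eta * F(z))   # extrapolation step
        z = proj_ball(z - eta * F(z_half))   # update with F(z_half)
    return z

def ogda(z0, eta=0.05, T=2000):
    """A projected optimistic gradient descent-ascent variant (Popov,
    1980): reuses the previous operator evaluation as a prediction."""
    z, F_prev = z0, F(z0)
    for _ in range(T):
        F_cur = F(z)
        z = proj_ball(z - eta * (2 * F_cur - F_prev))
        F_prev = F_cur
    return z

z0 = proj_ball(rng.standard_normal(10))
# The operator norm at the last iterate should shrink toward 0, since
# the equilibrium of this toy problem is the origin.
print(np.linalg.norm(F(extragradient(z0))))
print(np.linalg.norm(F(ogda(z0))))
```

In both loops the step size `eta` is taken well below $1/L$ for the Lipschitz constant $L$ of the operator, which is the standard regime in which both methods converge.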