The extragradient method (EG), introduced by G. M. Korpelevich in 1976, is a well-known method for approximating solutions of saddle-point problems and their extensions, such as variational inequalities and monotone inclusions. Over the years, numerous variants of EG have been proposed and studied in the literature. Recently, these methods have gained popularity due to new applications in machine learning and robust optimization. In this work, we survey the latest developments in the EG method and its variants for approximating solutions of nonlinear equations and inclusions, with a focus on the monotone and co-hypomonotone settings. We provide a unified convergence analysis for different classes of algorithms, with an emphasis on sublinear best-iterate and last-iterate convergence rates. We also discuss recent accelerated variants of EG based on both the Halpern fixed-point iteration and Nesterov's acceleration techniques. Our approach uses simple arguments and basic mathematical tools to keep the proofs as elementary as possible, while maintaining sufficient generality to cover a broad range of problems.
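As a brief illustration of the scheme under discussion (a sketch for the single-valued equation setting $Fx = 0$, assuming $F$ is monotone and $L$-Lipschitz continuous with step size $\eta \in (0, 1/L)$), the classical EG iteration performs an extrapolation step followed by an update evaluated at the extrapolated point:
\[
y^k := x^k - \eta F(x^k), \qquad x^{k+1} := x^k - \eta F(y^k), \qquad k = 0, 1, 2, \dots
\]
For constrained variational inequalities, each step is additionally composed with a projection onto the feasible set; the variants surveyed in this work modify this template in the number of operator evaluations, the step-size rule, or the anchoring/acceleration terms.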