We study the stochastic bilinear minimax optimization problem, analyzing the same-sample Stochastic ExtraGradient (SEG) method with constant step size and presenting variations of the method that yield favorable convergence. In sharp contrast with the basic SEG method, whose last iterate only contracts to a fixed neighborhood of the Nash equilibrium, SEG augmented with iteration averaging provably converges to the Nash equilibrium under the same standard settings, and the rate is further improved by incorporating a scheduled restarting procedure. In the interpolation setting, where the noise vanishes at the Nash equilibrium, we achieve an optimal convergence rate up to tight constants. We present numerical experiments that validate our theoretical findings and demonstrate the effectiveness of the SEG method when equipped with iteration averaging and restarting.
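To make the described procedure concrete, the following is a minimal sketch (not the paper's exact setup) of same-sample SEG with constant step size and iteration averaging on a toy stochastic bilinear game min_x max_y x^T A y with Nash equilibrium at the origin; the noise model, matrix A, step size eta, and horizon T are illustrative assumptions, and the scheduled restarting variant is omitted.

```python
import numpy as np

# Illustrative sketch, not the paper's exact algorithm or noise model:
# same-sample Stochastic ExtraGradient (SEG) with constant step size on
#   min_x max_y  x^T A y,
# whose Nash equilibrium is (0, 0), plus simple iteration averaging.

rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))   # assumed problem matrix
eta = 0.05                        # assumed constant step size
sigma = 0.1                       # assumed additive gradient-noise level
T = 5000                          # assumed number of iterations

def noisy_operator(x, y, xi):
    # Stochastic gradient operator of the bilinear game, perturbed by
    # additive noise xi; the same sample is reused within one iteration.
    gx = A @ y + xi[:d]
    gy = -A.T @ x + xi[d:]
    return gx, gy

x, y = rng.standard_normal(d), rng.standard_normal(d)
x_avg, y_avg = np.zeros(d), np.zeros(d)

for t in range(1, T + 1):
    xi = sigma * rng.standard_normal(2 * d)        # one sample per iteration
    gx, gy = noisy_operator(x, y, xi)              # extrapolation step
    x_half, y_half = x - eta * gx, y - eta * gy
    gx, gy = noisy_operator(x_half, y_half, xi)    # update step, same sample
    x, y = x - eta * gx, y - eta * gy
    # Running (iteration) average of the SEG iterates.
    x_avg += (x - x_avg) / t
    y_avg += (y - y_avg) / t

print("last-iterate distance to equilibrium:",
      np.hypot(np.linalg.norm(x), np.linalg.norm(y)))
print("averaged-iterate distance to equilibrium:",
      np.hypot(np.linalg.norm(x_avg), np.linalg.norm(y_avg)))
```

In this toy run the averaged iterate typically lands much closer to the equilibrium than the last iterate, which is the qualitative behavior the abstract describes; the exact rates and constants are established in the paper itself.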