The Stochastic Extragradient (SEG) method is one of the most popular algorithms for solving min-max optimization and variational inequality problems (VIPs) arising in various machine learning tasks. However, several important questions about the convergence properties of SEG remain open, including the sampling of stochastic gradients, mini-batching, convergence guarantees for monotone finite-sum variational inequalities with possibly non-monotone terms, and others. To address these questions, in this paper we develop a novel theoretical framework that allows us to analyze several variants of SEG in a unified manner. Besides standard setups, such as Same-Sample SEG under Lipschitzness and monotonicity or Independent-Samples SEG under uniformly bounded variance, our approach allows us to analyze variants of SEG that were never explicitly considered in the literature before. Notably, we analyze SEG with arbitrary sampling, which includes importance sampling and various mini-batching strategies as special cases. Our rates for these new variants of SEG outperform the current state-of-the-art convergence guarantees and rely on less restrictive assumptions.
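To make the algorithmic object concrete, below is a minimal Python sketch of the standard two-step SEG update: an extrapolation step followed by an update step, each using a stochastic estimate of the operator. The distinction the abstract draws between Same-Sample SEG (one sample reused in both steps) and Independent-Samples SEG (a fresh sample per step) is controlled by a flag. The oracle name `grad_oracle`, the toy bilinear example, and all stepsize values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def seg(grad_oracle, x0, gamma1, gamma2, n_iters, same_sample=True, rng=None):
    """Sketch of Stochastic Extragradient (SEG) for a VIP with operator F.

    grad_oracle(x, seed) should return an unbiased stochastic estimate of F(x).
    same_sample=True  -> Same-Sample SEG: reuse one sample in both steps.
    same_sample=False -> Independent-Samples SEG: fresh sample for each step.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        seed1 = int(rng.integers(1 << 30))
        x_half = x - gamma1 * grad_oracle(x, seed1)        # extrapolation step
        seed2 = seed1 if same_sample else int(rng.integers(1 << 30))
        x = x - gamma2 * grad_oracle(x_half, seed2)        # update step
    return x

if __name__ == "__main__":
    # Toy monotone operator: F(z) = A z with additive noise, where A is
    # antisymmetric (a bilinear min-max game with solution z* = 0).
    A = np.array([[0.0, 1.0], [-1.0, 0.0]])

    def oracle(z, seed):
        g = np.random.default_rng(seed)
        return A @ z + 0.1 * g.standard_normal(z.shape)    # unbiased estimate

    z_last = seg(oracle, x0=[1.0, 1.0], gamma1=0.2, gamma2=0.1,
                 n_iters=2000, same_sample=True)
    print(z_last)  # approaches a noise-dominated neighborhood of z* = 0
```

With constant stepsizes and a noisy oracle, the iterates settle into a neighborhood of the solution whose size is governed by the noise level; different stepsizes `gamma1 >= gamma2` for the two steps are a common choice in SEG analyses.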