Among the very first variance-reduced stochastic methods for solving the empirical risk minimization problem was the SVRG method (Johnson & Zhang, 2013). SVRG is an inner-outer loop method: in the outer loop a reference full gradient is evaluated, after which $m \in \mathbb{N}$ inner-loop steps are executed, each using the reference gradient to build a variance-reduced estimate of the current gradient. The simplicity of SVRG and its analysis has led to multiple extensions and variants, including for non-convex optimization. We provide a more general analysis of SVRG than was previously available by using arbitrary sampling, which allows us to analyse virtually all forms of mini-batching through a single theorem. Furthermore, our analysis focuses on more practical variants of SVRG, including a new variant of the loopless SVRG (Hofmann et al., 2015; Kovalev et al., 2019; Kulunchakov & Mairal, 2019) and a variant of $k$-SVRG (Raj & Stich, 2018) with $m = n$, where $n$ is the number of data points. Since our setup and analysis reflect what is done in practice, our theory can be used to set parameters such as the mini-batch size and step size in a way that produces a more efficient algorithm in practice, as we show in extensive numerical experiments.
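For concreteness, the inner-loop update of SVRG takes the following standard form (a sketch in our notation; the symbols $\gamma$ for the step size and $i_k$ for the sampled index are illustrative):

$$g_k = \nabla f_{i_k}(x_k) - \nabla f_{i_k}(w) + \nabla f(w), \qquad x_{k+1} = x_k - \gamma g_k,$$

where $w$ is the reference point from the most recent outer loop, at which the full gradient $\nabla f(w)$ was computed, and $i_k$ is drawn uniformly from $\{1,\dots,n\}$; under arbitrary sampling, a random mini-batch takes the place of the single index $i_k$. Since $\mathbb{E}\left[\nabla f_{i_k}(w)\right] = \nabla f(w)$, the estimate $g_k$ is unbiased, and its variance shrinks as both $x_k$ and $w$ approach the solution.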