Stochastic variance reduced gradient (SVRG) is a popular variance reduction technique for stochastic gradient descent (SGD). We provide a first analysis of the method for solving a class of linear inverse problems through the lens of classical regularization theory. We prove that, for a suitable constant step size schedule, the method achieves an optimal convergence rate in terms of the noise level (under a suitable regularity condition), and that the variance of the SVRG iterate error is smaller than that of SGD. These theoretical findings are corroborated by a set of numerical experiments.
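For concreteness, below is a minimal sketch of the SVRG iteration in this setting, applied to a discretized linear inverse problem written as minimizing (1/2n)‖Ax − y^δ‖² over x, where y^δ is the noisy data. All names (svrg, A, y_delta, step, n_epochs) are illustrative assumptions, not the paper's notation or implementation; the constant step size mirrors the schedule analyzed in the abstract.

```python
# A minimal SVRG sketch for min_x (1/2n) * ||A x - y_delta||^2,
# with per-row stochastic gradients f_i(x) = a_i (a_i^T x - y_i).
# Illustrative only; not the paper's implementation.
import numpy as np

def svrg(A, y_delta, step=1e-2, n_epochs=50, rng=None):
    rng = np.random.default_rng(rng)
    n, d = A.shape
    x_snap = np.zeros(d)              # snapshot iterate (x tilde)
    for _ in range(n_epochs):
        # full gradient at the snapshot: (1/n) * A^T (A x_snap - y_delta)
        full_grad = A.T @ (A @ x_snap - y_delta) / n
        x = x_snap.copy()
        for _ in range(n):            # inner loop: one pass over the rows
            i = rng.integers(n)
            a_i = A[i]
            # per-sample gradients at the current iterate and the snapshot
            g_x = a_i * (a_i @ x - y_delta[i])
            g_snap = a_i * (a_i @ x_snap - y_delta[i])
            # variance-reduced step with a constant step size
            x -= step * (g_x - g_snap + full_grad)
        x_snap = x                    # refresh the snapshot
    return x_snap
```

The correction term g_snap − full_grad is what distinguishes SVRG from plain SGD: it leaves the update unbiased while shrinking its variance, which is the effect the abstract's variance comparison with SGD refers to. For noisy inverse problems, early stopping of the outer loop would play the role of regularization.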