In this paper, we consider improving the stochastic variance reduced gradient (SVRG) method by incorporating curvature information of the objective function. We propose to reduce the variance of stochastic gradients using the computationally efficient Barzilai-Borwein (BB) method by incorporating it into SVRG. We also propose a variant that uses a BB step size. We prove a linear convergence theorem that applies not only to the proposed method but also to other existing variants of SVRG with second-order information. We conduct numerical experiments on benchmark datasets and show that the proposed method with a constant step size outperforms existing variance-reduced methods on some test problems.
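To make the ideas in the abstract concrete, the following is a minimal sketch of how a Barzilai-Borwein step size can be combined with the SVRG inner loop. This is an illustrative reconstruction, not the paper's exact algorithm: the function names (`svrg_bb`, `grad_full`, `grad_i`) and the particular BB formula (step size recomputed once per epoch from differences of snapshot iterates and full gradients) are assumptions for the sketch.

```python
import numpy as np

def svrg_bb(grad_full, grad_i, x0, n, n_epochs=20, m=None, eta0=0.01, seed=None):
    """Sketch of SVRG with a Barzilai-Borwein (BB) step size.

    grad_full(x): full gradient of the objective at x.
    grad_i(x, i): gradient of the i-th component function at x.
    Once per epoch the step size is set by a BB rule (an assumed form):
        eta_k = ||s_k||^2 / (m * s_k^T y_k),
    where s_k = x_k - x_{k-1} and y_k = mu_k - mu_{k-1} are differences of
    successive snapshot points and their full gradients.
    """
    rng = np.random.default_rng(seed)
    m = m or 2 * n                      # inner-loop length
    x_tilde = x0.copy()
    mu = grad_full(x_tilde)             # full gradient at the snapshot
    eta = eta0
    x_prev, mu_prev = None, None
    for _ in range(n_epochs):
        if x_prev is not None:          # BB step size from the last two snapshots
            s = x_tilde - x_prev
            y = mu - mu_prev
            denom = m * s.dot(y)
            if abs(denom) > 1e-12:
                eta = s.dot(s) / denom
        x_prev, mu_prev = x_tilde.copy(), mu.copy()
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            # variance-reduced stochastic gradient (standard SVRG estimator)
            v = grad_i(x, i) - grad_i(x_tilde, i) + mu
            x -= eta * v
        x_tilde = x
        mu = grad_full(x_tilde)
    return x_tilde
```

For strongly convex problems such as ridge regression, the BB rule here stays bounded (the curvature term `s.dot(y)` is positive), so the epochs contract toward the minimizer without any manual step-size tuning after `eta0`.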