Past research has indicated that the covariance of the error in Stochastic Gradient Descent (SGD) performed via minibatching plays a critical role in determining its regularization properties and its escape from low-potential points. Motivated by recent research in this area, we prove universality results by showing that noise classes with the same mean and covariance structure as minibatch SGD noise exhibit similar properties. We mainly consider the Multiplicative Stochastic Gradient Descent (M-SGD) algorithm, introduced in previous work, which admits a much more general noise class than minibatch SGD. We establish non-asymptotic bounds for the M-SGD algorithm in the Wasserstein distance. We also show that, at any fixed point of the M-SGD algorithm, the M-SGD error is approximately a scaled Gaussian distribution with mean $0$.
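To make the noise-matching idea concrete, below is a minimal numerical sketch (our illustration, not code from the paper) for a toy least-squares model: it draws stochastic gradients at a fixed point both via minibatching and via an M-SGD-style step with mean-one Gaussian weights whose covariance matches that of the minibatch indicator weights, then checks that the two noise classes share their first two moments. All names here (`per_sample_grads`, `msgd_grad`, the synthetic data) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, b = 200, 5, 20                     # data size, dimension, batch size
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def per_sample_grads(theta):
    # Gradients of the per-sample least-squares losses f_i(theta).
    return (X @ theta - y)[:, None] * X  # shape (n, d)

theta = rng.normal(size=d)               # fixed point at which noise is compared
G = per_sample_grads(theta)

def minibatch_grad():
    # SGD via minibatching: average over a batch drawn without replacement.
    idx = rng.choice(n, size=b, replace=False)
    return G[idx].mean(axis=0)

# Mean-1 Gaussian weights with the covariance of the minibatch indicator
# weights w_i = (n/b) * 1{i in batch}:
#   Var(w_i) = n/b - 1,   Cov(w_i, w_j) = -(n - b) / (b * (n - 1)),  i != j.
off = -(n - b) / (b * (n - 1))
cov = np.full((n, n), off) + (n / b - 1 - off) * np.eye(n)

def msgd_grad():
    # M-SGD-style estimator: reweight every per-sample gradient by a random
    # weight vector whose first two moments match minibatch sampling.
    w = 1.0 + rng.multivariate_normal(np.zeros(n), cov, method="eigh")
    return (w @ G) / n

# Both estimators are unbiased for the full gradient and, by construction,
# share a covariance, so empirical moments should agree up to Monte Carlo error.
A = np.array([minibatch_grad() for _ in range(20000)])
B = np.array([msgd_grad() for _ in range(20000)])
print("mean gap:", np.linalg.norm(A.mean(axis=0) - B.mean(axis=0)))
print("cov gap: ", np.linalg.norm(np.cov(A.T) - np.cov(B.T)))
```

Both estimators are the same linear form $(1/n)\sum_i w_i \nabla f_i(\theta)$ in the weight vector $w$, so matching the mean and covariance of the weights forces the gradient noise of the two schemes to match in mean and covariance as well.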