Traditional analyses of gradient descent show that when the largest eigenvalue of the Hessian, also known as the sharpness $S(\theta)$, is bounded by $2/\eta$, training is "stable" and the training loss decreases monotonically. Recent works, however, have observed that this assumption does not hold when training modern neural networks with full batch or large batch gradient descent. Most recently, Cohen et al. (2021) observed two important phenomena. The first, dubbed progressive sharpening, is that the sharpness steadily increases throughout training until it reaches the instability cutoff $2/\eta$. The second, dubbed edge of stability, is that the sharpness hovers at $2/\eta$ for the remainder of training while the loss continues decreasing, albeit non-monotonically. We demonstrate that, far from being chaotic, the dynamics of gradient descent at the edge of stability can be captured by a cubic Taylor expansion: as the iterates diverge in the direction of the top eigenvector of the Hessian due to instability, the cubic term in the local Taylor expansion of the loss function causes the curvature to decrease until stability is restored. This property, which we call self-stabilization, is a general property of gradient descent and explains its behavior at the edge of stability. A key consequence of self-stabilization is that gradient descent at the edge of stability implicitly follows projected gradient descent (PGD) under the constraint $S(\theta) \le 2/\eta$. Our analysis provides precise predictions for the loss, sharpness, and deviation from the PGD trajectory throughout training, which we verify both empirically in a number of standard settings and theoretically under mild conditions. Our analysis uncovers the mechanism for gradient descent's implicit bias towards stability.
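To make the quantities above concrete, the following minimal Python sketch (not from the paper; the toy loss, initialization, and step size are illustrative assumptions) runs full-batch gradient descent on a two-parameter loss while tracking the sharpness $S(\theta)$, the largest Hessian eigenvalue, against the stability cutoff $2/\eta$. Increasing $\eta$ or the imbalance of the initialization can push the trajectory toward the regime where $S(\theta)$ approaches $2/\eta$.

```python
import numpy as np

# Minimal sketch (illustrative, not from the paper): full-batch gradient descent
# on the toy loss L(a, b) = (a*b - 1)^2 / 2, tracking the sharpness
# S(theta) -- the largest Hessian eigenvalue -- against the cutoff 2/eta.

def loss(theta):
    a, b = theta
    return 0.5 * (a * b - 1.0) ** 2

def grad(theta):
    a, b = theta
    r = a * b - 1.0
    return np.array([r * b, r * a])

def hessian(theta):
    a, b = theta
    # d2L/da2 = b^2, d2L/db2 = a^2, d2L/dadb = 2ab - 1
    return np.array([[b * b, 2 * a * b - 1.0],
                     [2 * a * b - 1.0, a * a]])

def sharpness(theta):
    return np.linalg.eigvalsh(hessian(theta))[-1]  # largest eigenvalue

eta = 0.2                      # step size; stability cutoff is 2/eta = 10
theta = np.array([3.0, 0.1])   # imbalanced initialization (chosen for illustration)

for t in range(201):
    if t % 40 == 0:
        print(f"step {t:3d}  loss {loss(theta):.5f}  "
              f"sharpness {sharpness(theta):.3f}  cutoff {2.0 / eta:.1f}")
    theta = theta - eta * grad(theta)
```

In realistic networks the Hessian is never formed explicitly; the sharpness would instead be estimated with power iteration on Hessian-vector products, but the comparison against $2/\eta$ proceeds the same way.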