The generalization mystery of overparametrized deep nets has motivated efforts to understand how gradient descent (GD) converges to low-loss solutions that generalize well. Real-life neural networks are initialized from small random values and trained with cross-entropy loss for classification (unlike the "lazy" or "NTK" regime of training, where analysis has been more successful), and a recent sequence of results (Lyu and Li, 2020; Chizat and Bach, 2020; Ji and Telgarsky, 2020) provides theoretical evidence that GD may converge to the "max-margin" solution with zero loss, which presumably generalizes well. However, the global optimality of margin has been proved only in settings where neural nets are infinitely or exponentially wide. The current paper establishes this global optimality for two-layer Leaky ReLU nets trained with gradient flow on linearly separable and symmetric data, regardless of the width. The analysis also gives some theoretical justification for recent empirical findings (Kalimeris et al., 2019) on the so-called simplicity bias of GD towards linear or other "simple" classes of solutions, especially early in training. On the pessimistic side, the paper suggests that such results are fragile: a simple data manipulation can make gradient flow converge to a linear classifier with suboptimal margin.
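The setting described above can be reproduced numerically. Below is a minimal sketch, not taken from the paper's code, of the training setup it studies: a two-layer Leaky ReLU net with small random initialization, trained by full-batch gradient descent (a discretization of gradient flow) under the logistic loss on symmetric, linearly separable 2-D data. All names, hyperparameters, and the synthetic data generator are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Symmetric, linearly separable data: for every (x, y), the point (-x, -y) is also included.
n_half, d = 50, 2
X_half = rng.normal(size=(n_half, d))
X_half[:, 0] += 2.0 * np.sign(X_half[:, 0])      # push points away from the separator x_1 = 0
y_half = np.sign(X_half[:, 0])
X = np.concatenate([X_half, -X_half])            # symmetrize the dataset
y = np.concatenate([y_half, -y_half])

# --- Two-layer Leaky ReLU net: f(x) = sum_j a_j * leaky_relu(w_j . x), small random init.
m, alpha = 20, 0.1                               # width and Leaky ReLU negative slope (assumed values)
W = 1e-3 * rng.normal(size=(m, d))
a = 1e-3 * rng.normal(size=m)

def forward(X):
    pre = X @ W.T                                # (n, m) pre-activations
    act = np.where(pre > 0, pre, alpha * pre)    # Leaky ReLU
    return act @ a, pre, act

lr = 0.1
for step in range(50000):
    out, pre, act = forward(X)
    margin = y * out
    # Logistic loss L = mean(log(1 + exp(-y f(x)))); dL/d(out_i) = -y_i / (1 + exp(y_i f(x_i))) / n
    g_out = -y / (1.0 + np.exp(margin)) / len(y)
    grad_a = act.T @ g_out
    d_pre = np.outer(g_out, a) * np.where(pre > 0, 1.0, alpha)
    grad_W = d_pre.T @ X
    a -= lr * grad_a
    W -= lr * grad_W

out, _, _ = forward(X)
print("training accuracy:", np.mean(np.sign(out) == y))

# Normalized margin (the quantity whose global optimality is at stake); the network is
# 2-homogeneous in its parameters, hence the division by the squared parameter norm.
norm_sq = np.sum(W**2) + np.sum(a**2)
print("normalized margin:", np.min(y * out) / norm_sq)

# Rough check of the simplicity-bias claim: in this symmetric setting the learned classifier
# should behave like a linear one; compare its predictions with the linear rule sign(x_1).
grid = rng.normal(size=(1000, d))
pred, _, _ = forward(grid)
print("agreement with sign(x_1):", np.mean(np.sign(pred) == np.sign(grid[:, 0])))
```

With small enough learning rate and long enough training, the printed normalized margin should increase over training and the decision boundary should stay close to linear, consistent with the margin-maximization and simplicity-bias claims; the exact numbers depend on the assumed data distribution and hyperparameters.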