Overparametrization is a key factor used to explain the global convergence of gradient descent (GD) for neural networks in the absence of convexity. Besides the well-studied lazy regime, infinite-width (mean-field) analyses have been developed for shallow networks, relying on convex optimization techniques. To bridge the gap between the lazy and mean-field regimes, we study Residual Networks (ResNets) in which the residual block has a linear parametrization while still being nonlinear. Such ResNets admit both infinite-depth and infinite-width limits, encoding residual blocks in a Reproducing Kernel Hilbert Space (RKHS). In this limit, we prove a local Polyak-Lojasiewicz inequality. Thus, every critical point is a global minimizer and a local convergence result for GD holds, recovering the lazy regime. In contrast with other mean-field studies, our analysis applies to both the parametric and non-parametric cases under an expressivity condition on the residuals. It also leads to a practical and quantified recipe: starting from a universal RKHS, Random Fourier Features are applied to obtain a finite-dimensional parameterization that satisfies our expressivity condition with high probability.
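For readers unfamiliar with the terminology, a local Polyak-Lojasiewicz (PL) inequality of the kind mentioned above can be stated in the following generic form; this is a standard formulation (with a loss L, parameters θ, a constant μ > 0, and a neighborhood B of the initialization), not the exact statement proved in the paper.

```latex
% Generic local Polyak-Lojasiewicz inequality (standard form, for illustration):
% on a neighborhood \mathcal{B} of the initialization, the squared gradient norm
% controls the suboptimality gap, so every critical point in \mathcal{B} is a
% global minimizer and GD converges linearly while its iterates stay in \mathcal{B}.
\exists\, \mu > 0,\ \forall\, \theta \in \mathcal{B} : \qquad
\tfrac{1}{2}\,\lVert \nabla L(\theta) \rVert^{2}
\;\geq\; \mu \left( L(\theta) - \inf_{\vartheta} L(\vartheta) \right).
```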
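As an illustration of the recipe's last step, below is a minimal NumPy sketch (not code from the paper) of Random Fourier Features for a Gaussian kernel, used to build a finite-dimensional, linear-in-parameters residual block. The function name, the Gaussian-kernel choice, and all dimensions and constants are illustrative assumptions.

```python
import numpy as np

def random_fourier_features(X, num_features=512, sigma=1.0, rng=None):
    """Map inputs X of shape (n, d) to features whose inner product
    approximates the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    W = rng.normal(scale=1.0 / sigma, size=(d, num_features))  # random frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)       # random phases
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

# A residual block that is linear in its trainable parameters V but nonlinear in x:
# f_V(x) = x + V @ phi(x), where phi is the (fixed) random feature map.
rng = np.random.default_rng(0)
X = rng.normal(size=(128, 16))                      # a batch of 16-dimensional inputs
Phi = random_fourier_features(X, num_features=512, sigma=2.0, rng=rng)
V = np.zeros((16, 512))                             # trainable linear map (zero-initialized; training not shown)
residual_output = X + Phi @ V.T                     # linear-in-parameters residual block
```

The point of the sketch is the structure: the nonlinearity is frozen inside the random feature map, so the only trained object is the linear map V, which is what makes the residual block linearly parametrized while remaining nonlinear in its input.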