We study the optimization problem associated with fitting two-layer ReLU neural networks with respect to the squared loss, where labels are generated by a target network. We make use of the rich symmetry structure to develop a novel set of tools for studying families of spurious minima. In contrast to existing approaches that operate in limiting regimes, our technique directly addresses the nonconvex loss landscape for a finite number of inputs $d$ and neurons $k$, and provides analytic, rather than heuristic, information. In particular, we derive analytic estimates for the loss at different minima, and prove that, modulo $O(d^{-1/2})$ terms, the Hessian spectrum concentrates near small positive constants, with the exception of $\Theta(d)$ eigenvalues which grow linearly with~$d$. We further show that the Hessian spectra at global and spurious minima coincide to $O(d^{-1/2})$-order, thus challenging our ability to argue about statistical generalization through local curvature. Lastly, our technique provides the exact \emph{fractional} dimensionality at which families of critical points turn from saddles into spurious minima. This makes it possible to study the creation and annihilation of spurious minima using powerful tools from equivariant bifurcation theory.
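A schematic form of the spectral claim may help fix ideas. The display below is an illustrative sketch only: the constants $c_i$ and the eigenvalue split stand in for the precise quantities derived in the paper, with asymptotics taken in the input dimension $d$.
% Illustrative sketch, not the paper's exact statement: c_i > 0 denote
% small positive constants; W is a (global or spurious) minimum of the
% squared loss L.
\[
  \operatorname{spec}\bigl(\nabla^2 L(W)\bigr)
  = \underbrace{\bigl\{\, c_i + O(d^{-1/2}) \,\bigr\}}_{\text{all but } \Theta(d) \text{ eigenvalues}}
  \;\cup\;
  \underbrace{\bigl\{\, \Theta(d) \,\bigr\}}_{\Theta(d) \text{ eigenvalues, growing linearly in } d}.
\]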