We develop new theoretical results on matrix perturbation to shed light on the impact of architecture on the performance of a deep network. In particular, we explain analytically what deep learning practitioners have long observed empirically: the parameters of some deep architectures (e.g., residual networks (ResNets) and densely connected networks (DenseNets)) are easier to optimize than others (e.g., convolutional networks (ConvNets)). Building on our earlier work connecting deep networks with continuous piecewise-affine splines, we develop an exact local linear representation of a deep network layer for a family of modern deep networks that includes ConvNets at one end of a spectrum and ResNets, DenseNets, and other networks with skip connections at the other. For regression and classification tasks that optimize the squared-error loss, we show that the optimization loss surface of a modern deep network is piecewise quadratic in the parameters, with local shape governed by the singular values of a matrix that is a function of the local linear representation. We develop new perturbation results for how the singular values of matrices of this sort behave as we add a fraction of the identity and multiply by certain diagonal matrices. A direct application of our perturbation results explains analytically why a network with skip connections (such as a ResNet or DenseNet) is easier to optimize than a ConvNet: thanks to its more stable singular values and smaller condition number, the local loss surface of such a network is less erratic, less eccentric, and features local minima that are more accommodating to gradient-based optimization. Our results also shed new light on the impact of different nonlinear activation functions on a deep network's singular values, regardless of its architecture.
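The conditioning benefit of adding a fraction of the identity can be seen in a small numerical sketch. This is our own illustration, not a construction from the paper: we treat a layer's local linear representation as a generic small random matrix `A` and compare its condition number with that of the skip-connection counterpart `I + A`.

```python
import numpy as np

# Hypothetical illustration (not the paper's construction): compare the
# condition number of a layer's local linear map A with that of the
# residual map I + A induced by a skip connection.
rng = np.random.default_rng(0)
n = 50
A = 0.05 * rng.standard_normal((n, n))      # small random linear map, ||A|| < 1

cond_plain = np.linalg.cond(A)              # ConvNet-like layer: A alone
cond_skip = np.linalg.cond(np.eye(n) + A)   # ResNet-like layer: I + A

# Since ||A|| < 1 here, every singular value of I + A lies in
# [1 - ||A||, 1 + ||A||], clustering near 1, so I + A is far better
# conditioned than A (whose condition number is scale-invariant and
# typically large for a random square matrix).
print(cond_plain, cond_skip)
```

Note that the condition number of `A` is unchanged by the 0.05 scaling; the improvement comes entirely from the identity shift, mirroring the paper's claim that skip connections stabilize singular values.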