Assessing the complexity of functions computed by a neural network helps us understand how the network will learn and generalize. One natural measure of complexity is how the network distorts length: if the network takes a unit-length curve as input, what is the length of the resulting curve of outputs? It has been widely believed that this length grows exponentially in network depth. We prove that in fact this is not the case: for ReLU networks with standard random initialization, the expected length distortion does not grow with depth and indeed shrinks slightly. We also generalize this result by proving upper bounds both for higher moments of the length distortion and for the distortion of higher-dimensional volumes. These theoretical results are corroborated by our experiments.
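As a rough illustration of the quantity studied here, the following sketch estimates the length distortion empirically: a unit-length curve in input space is discretized, pushed through a randomly initialized ReLU network, and the length of the output polyline is measured. The specific architecture, the He-style initialization, and the straight-line input curve are illustrative assumptions, not the exact experimental setup of the paper.

```python
import numpy as np

def relu_mlp_length_distortion(depth=10, width=100, n_points=2000, seed=0):
    """Estimate the length distortion of a randomly initialized ReLU MLP.

    A unit-length straight segment in input space is discretized into
    n_points, passed through the network, and the length of the resulting
    output polyline is returned. Since the input curve has length 1, the
    output length equals the length distortion.
    """
    rng = np.random.default_rng(seed)

    # Unit-length input curve: a straight segment of length 1 along a
    # random direction in the width-dimensional input space.
    direction = rng.normal(size=width)
    direction /= np.linalg.norm(direction)
    t = np.linspace(0.0, 1.0, n_points)
    x = t[:, None] * direction[None, :]            # shape (n_points, width)

    # Random ReLU layers with He-style initialization: W ~ N(0, 2 / fan_in).
    for _ in range(depth):
        W = rng.normal(scale=np.sqrt(2.0 / width), size=(width, width))
        x = np.maximum(x @ W.T, 0.0)

    # Length of the output polyline = sum of consecutive segment lengths.
    return np.sum(np.linalg.norm(np.diff(x, axis=0), axis=1))

if __name__ == "__main__":
    # Average the estimate over a few random initializations per depth.
    for d in (1, 5, 10, 20):
        estimates = [relu_mlp_length_distortion(depth=d, seed=s) for s in range(5)]
        print(f"depth {d:2d}: mean length distortion ~ {np.mean(estimates):.3f}")
```

Averaging over several random seeds gives a crude estimate of the expected distortion at each depth; a serious experiment would also include biases and vary the curve and the layer widths.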