We analyze the number of neurons that a ReLU neural network needs to approximate multivariate monomials. We establish an exponential lower bound on the complexity of any shallow network that approximates the product function $\vec{x} \mapsto \prod_{i=1}^d x_i$ on a general compact domain. Furthermore, we prove that this lower bound does not hold for normalized $O(1)$-Lipschitz monomials (or, equivalently, when the domain is restricted to the unit cube). These results suggest that shallow ReLU networks suffer from the curse of dimensionality when expressing functions whose Lipschitz constant scales with the input dimension, and that the expressive power of neural networks lies in their depth rather than in their overall complexity.
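As a brief illustration of the normalization in the second result (a sketch added here, not part of the original abstract, with the domain $[0,c]^d$ chosen only as an example of a general compact domain): on the unit cube the product function is already $O(1)$-Lipschitz, since each partial derivative is a product of at most $d-1$ coordinates bounded by one,
$$\Bigl\|\nabla \prod_{i=1}^d x_i\Bigr\|_\infty \;=\; \max_{j} \prod_{i \ne j} |x_i| \;\le\; 1 \qquad \text{for } \vec{x} \in [0,1]^d,$$
whereas on $[0,c]^d$ with $c > 1$ the same quantity can reach $c^{d-1}$, so the Lipschitz constant of the unnormalized monomial grows with the input dimension $d$.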