When studying the expressive power of neural networks, a main challenge is to understand how the size and depth of the network affect its ability to approximate real functions. However, not all functions are interesting from a practical viewpoint: functions of interest usually have a polynomially-bounded Lipschitz constant, and can be computed efficiently. We call functions that satisfy these conditions "natural", and explore the benefits of size and depth for approximating natural functions with ReLU networks. As we show, this problem is more challenging than the corresponding problem for non-natural functions. We give barriers to showing depth lower bounds: proving the existence of a natural function that cannot be approximated by polynomial-size networks of depth $4$ would settle longstanding open problems in computational complexity. This implies that beyond depth $4$ there is a barrier to showing depth-separation for natural functions, even between networks of constant depth and networks of nonconstant depth. We also study size-separation, namely, whether there are natural functions that can be approximated with networks of size $O(s(d))$ but not with networks of size $O(s'(d))$. We show a complexity-theoretic barrier to proving such results beyond size $O(d\log^2(d))$, but we also exhibit an explicit natural function that can be approximated with networks of size $O(d)$ but not with networks of size $o(d/\log d)$. For approximation in $L_\infty$ we achieve such a separation already between size $O(d)$ and size $o(d)$. Moreover, we show superpolynomial size lower bounds and barriers to such lower bounds, depending on the assumptions on the function. Our size-separation results rely on an analysis of size lower bounds for Boolean functions, which is of independent interest: we show linear size lower bounds for computing explicit Boolean functions with neural networks and threshold circuits.
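For concreteness, a minimal formalization of the "natural" condition might read as follows; the choice of domain $[-1,1]^d$ and the exact notion of efficient computability below are illustrative assumptions, not details fixed by the abstract:
$$\mathrm{Lip}(f) \;=\; \sup_{\mathbf{x}\neq\mathbf{y}\in[-1,1]^d} \frac{|f(\mathbf{x})-f(\mathbf{y})|}{\|\mathbf{x}-\mathbf{y}\|} \;\le\; \mathrm{poly}(d), \qquad f(\mathbf{x}) \text{ computable to accuracy } \epsilon \text{ in time } \mathrm{poly}(d,\,1/\epsilon).$$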