High-dimensional depth separation results for neural networks show that certain functions can be efficiently approximated by two-hidden-layer networks but not by one-hidden-layer ones in high dimension $d$. Existing results of this type mainly focus on functions with an underlying radial or one-dimensional structure, which are usually not encountered in practice. The first contribution of this paper is to extend such results to a more general class of functions, namely functions with a piecewise oscillatory structure, by building on the proof strategy of (Eldan and Shamir, 2016). We complement these results by showing that, if the domain radius and the rate of oscillation of the objective function are constant, then approximation by one-hidden-layer networks holds at a $\mathrm{poly}(d)$ rate for any fixed error threshold. A common theme in the proofs of such results is that one-hidden-layer networks fail to approximate high-energy functions whose Fourier representation is spread out over the frequency domain. On the other hand, existing results on the approximation of a function by one-hidden-layer neural networks rely on the function having a sparse Fourier representation. The choice of the domain is a further source of gaps between upper and lower approximation bounds. Focusing on a fixed approximation domain, namely the sphere $\mathbb{S}^{d-1}$ in dimension $d$, we characterize, in terms of their Fourier expansion, both the functions that are efficiently approximable by one-hidden-layer networks and the functions that provably are not.