We measure the complexity of a function class in $L^2(\mathbb R^d)$ using deep, sparsely connected neural networks, restricting both the connectivity and the memory required to store the networks. We also introduce representation systems, i.e., countable collections of functions that guide the neural networks, since approximation theory based on representation systems is well developed in mathematics. We then prove a fundamental bound theorem, showing that a quantity intrinsic to the function class itself yields information about the approximation power of both neural networks and representation systems. We further provide a method for transferring existing results on approximation by representation systems to approximation by neural networks, greatly extending the practical value of neural networks. Finally, we use neural networks to approximate B-spline functions, which generate B-spline curves, and we analyse the complexity of the class of $\beta$ cartoon-like functions using rate-distortion theory and the wedgelet construction.