In this paper, we develop a framework for showing that neural networks can overcome the curse of dimensionality in various high-dimensional approximation problems. Our approach is based on the notion of a catalog network, a generalization of a standard neural network in which the nonlinear activation functions are allowed to vary from layer to layer, as long as they are chosen from a predefined catalog of functions. As such, catalog networks constitute a rich family of continuous functions. We show that, under appropriate conditions on the catalog, catalog networks can be approximated efficiently with rectified linear unit (ReLU)-type networks, and we provide precise estimates on the number of parameters needed for a given approximation accuracy. As special cases of the general results, we obtain different classes of functions that can be approximated with ReLU networks without the curse of dimensionality.
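As an illustration of the architecture described above (not the paper's construction), the following minimal Python sketch shows the idea of a catalog network: each hidden layer applies an affine map followed by an activation function drawn from a predefined catalog that may differ from layer to layer. The two-function catalog {ReLU, tanh} and the names `CATALOG` and `CatalogNetwork` are assumptions for illustration only.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical catalog of admissible activation functions.
CATALOG = {"relu": relu, "tanh": np.tanh}

class CatalogNetwork:
    """Feedforward network whose per-layer activations are drawn from a catalog."""

    def __init__(self, layer_dims, activations, rng=None):
        # layer_dims: e.g. [d, h1, h2, 1]; activations: one catalog key per hidden layer.
        assert len(activations) == len(layer_dims) - 2
        rng = rng or np.random.default_rng(0)
        self.weights = [rng.standard_normal((m, n)) / np.sqrt(n)
                        for n, m in zip(layer_dims[:-1], layer_dims[1:])]
        self.biases = [np.zeros(m) for m in layer_dims[1:]]
        self.activations = [CATALOG[name] for name in activations]

    def __call__(self, x):
        # Hidden layers: affine map, then the layer's catalog activation.
        for W, b, sigma in zip(self.weights[:-1], self.biases[:-1], self.activations):
            x = sigma(W @ x + b)
        # Affine output layer without activation.
        return self.weights[-1] @ x + self.biases[-1]

# Usage: a network on R^10 with three hidden layers mixing activations across layers.
net = CatalogNetwork([10, 32, 32, 32, 1], ["relu", "tanh", "relu"])
y = net(np.ones(10))
```

Setting every entry of `activations` to `"relu"` recovers a standard ReLU network, which is the sense in which catalog networks generalize the standard architecture.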