In this article we identify a general class of high-dimensional continuous functions that can be approximated by deep neural networks (DNNs) with the rectified linear unit (ReLU) activation function without the curse of dimensionality. In other words, the number of DNN parameters grows at most polynomially in the input dimension and the reciprocal of the approximation error. The functions in our class can be expressed as a potentially unbounded number of compositions of special functions, which include products, maxima, and certain parallelized Lipschitz continuous functions.
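As an illustrative aside (not taken from the article itself), one of the special functions in the class, the maximum of two numbers, is exactly representable by a ReLU network with a single hidden layer of three units, via the identity max(x, y) = ReLU(x − y) + ReLU(y) − ReLU(−y). A minimal NumPy sketch verifying this identity:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def max_via_relu(x, y):
    # One hidden layer with 3 ReLU units computes max exactly:
    #   relu(y) - relu(-y) = y   and   relu(x - y) = max(x - y, 0),
    # so relu(x - y) + relu(y) - relu(-y) = max(x - y, 0) + y = max(x, y).
    return relu(x - y) + relu(y) - relu(-y)

# Quick numerical check on random inputs
rng = np.random.default_rng(0)
x, y = rng.standard_normal(1000), rng.standard_normal(1000)
assert np.allclose(max_via_relu(x, y), np.maximum(x, y))
```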