We study the expressive power of deep ReLU neural networks for approximating functions in dilated shift-invariant spaces, which are widely used in signal processing, image processing, and communications. Approximation error bounds are estimated in terms of the width and depth of the neural networks. The network construction is based on the bit-extraction and data-fitting capacities of deep neural networks. As applications of our main results, we obtain approximation rates for classical function spaces such as Sobolev spaces and Besov spaces. We also give lower bounds for the $L^p$ ($1\le p \le \infty$) approximation error on Sobolev spaces, which show that our network construction is asymptotically optimal up to a logarithmic factor.
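For concreteness, a standard definition of the spaces in question is sketched below; the generator $\phi$, dilation factor $2^j$, and coefficient-space normalization are conventional choices, and the paper's exact parameterization may differ.

```latex
% Sketch of a (conventional) dilated shift-invariant space:
% V^p(\phi) is the shift-invariant space generated by \phi,
% and V^p_j(\phi) its dilation by the (assumed) factor 2^j.
\[
  V^p(\phi) = \Bigl\{ \sum_{k \in \mathbb{Z}^d} c_k\, \phi(\cdot - k)
              \;:\; (c_k) \in \ell^p(\mathbb{Z}^d) \Bigr\},
  \qquad
  V^p_j(\phi) = \bigl\{ f(2^j \cdot) \;:\; f \in V^p(\phi) \bigr\}.
\]
```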