In recent years, functional neural networks have been proposed and studied to approximate nonlinear continuous functionals defined on $L^p([-1, 1]^s)$ for integers $s\ge1$ and $1\le p<\infty$. However, their theoretical properties are either largely unknown beyond the universality of approximation, or established by analyses that do not apply to the rectified linear unit (ReLU) activation function. To fill this void, we investigate the approximation power of functional deep neural networks with the ReLU activation function by constructing a continuous piecewise linear interpolation under a simple triangulation. In addition, we establish rates of approximation for the proposed functional deep ReLU networks under mild regularity conditions. Finally, our study may also shed some light on the understanding of functional data learning algorithms.
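The construction above rests on a standard fact: a one-hidden-layer ReLU network can exactly represent any continuous piecewise linear function. The following sketch (an illustration, not the paper's construction; the knot set and target function are arbitrary choices) realizes the linear interpolant of a function on $[-1,1]$ as an affine term plus a sum of ReLU units, one per interior knot, and checks it against `np.interp`:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Knots and values of an example continuous function on [-1, 1]
knots = np.linspace(-1.0, 1.0, 5)
vals = np.cos(knots)  # arbitrary illustrative target

# Slopes of the linear pieces between consecutive knots
slopes = np.diff(vals) / np.diff(knots)

def g(x):
    # CPwL interpolant as a shallow ReLU network:
    # g(x) = v0 + s0*(x - t0) + sum_k (s_k - s_{k-1}) * relu(x - t_k)
    out = vals[0] + slopes[0] * (x - knots[0])
    for k in range(1, len(slopes)):
        out = out + (slopes[k] - slopes[k - 1]) * relu(x - knots[k])
    return out

x = np.linspace(-1.0, 1.0, 101)
assert np.allclose(g(x), np.interp(x, knots, vals))
```

Each ReLU unit switches on one kink of the interpolant, so the slope changes by $s_k - s_{k-1}$ at knot $t_k$; this is the mechanism that makes ReLU networks natural for piecewise linear interpolation schemes of the kind the abstract describes.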