A three-hidden-layer neural network with super approximation power is introduced. This network is built with the floor function ($\lfloor x\rfloor$), the exponential function ($2^x$), the step function ($1_{x\geq 0}$), or their compositions as the activation function in each neuron, and hence we call such networks Floor-Exponential-Step (FLES) networks. For any width hyper-parameter $N\in\mathbb{N}^+$, it is shown that FLES networks with width $\max\{d,N\}$ and three hidden layers can uniformly approximate a H\"older continuous function $f$ on $[0,1]^d$ with an exponential approximation rate $3\lambda (2\sqrt{d})^{\alpha} 2^{-\alpha N}$, where $\alpha\in(0,1]$ and $\lambda>0$ are the H\"older order and constant, respectively. More generally, for an arbitrary continuous function $f$ on $[0,1]^d$ with a modulus of continuity $\omega_f(\cdot)$, the constructive approximation rate is $2\omega_f(2\sqrt{d})\,2^{-N}+\omega_f(2\sqrt{d}\,2^{-N})$. Moreover, we extend this result to general bounded continuous functions on a bounded set $E\subseteq\mathbb{R}^d$. As a consequence, this new class of networks overcomes the curse of dimensionality in approximation power when the variation of $\omega_f(r)$ as $r\rightarrow 0$ is moderate (e.g., $\omega_f(r)\lesssim r^\alpha$ for H\"older continuous functions), since the dominant term in our approximation rate is essentially $\sqrt{d}$ times a function of $N$ independent of $d$ inside the modulus of continuity. Finally, we extend our analysis to derive similar approximation results in the $L^p$-norm for $p\in[1,\infty)$ by replacing the Floor-Exponential-Step activation functions with continuous activation functions.
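To make the stated rate concrete, the following is a worked instance of the bound in the Lipschitz case $\alpha=1$; the specific values $d=100$ and $N=20$ are illustrative assumptions, not figures taken from the results above.
% Illustrative instance of the rate 3*lambda*(2*sqrt(d))^alpha*2^(-alpha*N); assumed values: d = 100, N = 20, alpha = 1.
\[
3\lambda\,(2\sqrt{d})^{\alpha}\,2^{-\alpha N}
= 3\lambda\cdot 2\sqrt{100}\cdot 2^{-20}
= \frac{60\lambda}{2^{20}}
\approx 5.7\times 10^{-5}\,\lambda,
\]
while the required width is $\max\{d,N\}=100$ and the depth remains three hidden layers; increasing $N$ further shrinks the bound exponentially without enlarging the width once $N\leq d$.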