An example of an activation function $\sigma$ is given such that networks with activations $\{\sigma, \lfloor\cdot\rfloor\}$, integer weights, and a fixed architecture depending on $d$ approximate continuous functions on $[0,1]^d$. The range of integer weights required for $\varepsilon$-approximation of H\"older continuous functions is derived, which leads to a convergence rate of order $n^{-\frac{2\beta}{2\beta+d}}\log_2 n$ for neural network regression estimation of an unknown $\beta$-H\"older continuous function from $n$ given samples.
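As an illustrative check of the stated rate (not part of the paper's construction), the sketch below evaluates $n^{-\frac{2\beta}{2\beta+d}}\log_2 n$ for a few sample sizes; the choices of $\beta$, $d$, and $n$ are assumptions made purely for illustration.

```python
import math

def holder_rate(n: int, beta: float, d: int) -> float:
    """Convergence rate n^{-2*beta/(2*beta + d)} * log2(n) as stated in the abstract."""
    return n ** (-2 * beta / (2 * beta + d)) * math.log2(n)

# Hypothetical smoothness/dimension setting and sample sizes (illustration only).
for n in (10**3, 10**4, 10**5):
    print(f"beta=1, d=2, n={n}: rate ~ {holder_rate(n, beta=1.0, d=2):.4f}")
```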