We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function. These bounds provide explicit estimates of the approximation error in terms of the size of the neural network. We show that tanh neural networks with only two hidden layers suffice to approximate functions at comparable or better rates than much deeper ReLU neural networks.
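As an illustration of the architecture in question, the following is a minimal sketch (in PyTorch) that fits a two-hidden-layer tanh network to an analytic target and reports the sup-norm error on a fine grid. The width, optimizer, and training setup are illustrative assumptions, not values from the paper, and the trained error reflects optimization as well as approximation, whereas the bounds above concern the best achievable approximation error.

import torch

torch.manual_seed(0)

# Analytic target on [-1, 1]; the approximation rates discussed above
# apply to such functions.
def target(x):
    return torch.sin(torch.pi * x)

width = 32  # hypothetical width; the bounds are stated in terms of network size
model = torch.nn.Sequential(
    torch.nn.Linear(1, width), torch.nn.Tanh(),
    torch.nn.Linear(width, width), torch.nn.Tanh(),
    torch.nn.Linear(width, 1),
)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_train = torch.linspace(-1.0, 1.0, 256).unsqueeze(1)
y_train = target(x_train)

# Standard mean-squared-error regression on the training grid.
for step in range(5000):
    opt.zero_grad()
    loss = torch.mean((model(x_train) - y_train) ** 2)
    loss.backward()
    opt.step()

# Sup-norm (L^infinity) error on a finer evaluation grid.
x_test = torch.linspace(-1.0, 1.0, 4096).unsqueeze(1)
with torch.no_grad():
    err = (model(x_test) - target(x_test)).abs().max().item()
print(f"sup-norm error: {err:.2e}")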