In this paper, we analyze the number of neurons and training parameters that a neural network needs to approximate multivariate functions of bounded second mixed derivatives, known as Korobov functions. We prove upper bounds on these quantities for shallow and deep neural networks, breaking the curse of dimensionality. Our bounds hold for general activation functions, including ReLU. We further prove that these bounds nearly match the minimal number of parameters any continuous function approximator needs to approximate Korobov functions, showing that neural networks are near-optimal function approximators.
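For reference, and not as part of the abstract's own notation, the Korobov functions of bounded second mixed derivatives referred to above are commonly taken to form the space (a standard definition; the paper's exact norm convention may differ):
\[
X^{2,\infty}\bigl([0,1]^d\bigr) \;=\; \Bigl\{\, f \in L^{\infty}\bigl([0,1]^d\bigr) \;:\; f\vert_{\partial [0,1]^d} = 0,\ \ D^{\boldsymbol{\alpha}} f \in L^{\infty}\bigl([0,1]^d\bigr) \ \text{for all } \lVert \boldsymbol{\alpha} \rVert_{\infty} \le 2 \,\Bigr\},
\]
equipped with the norm $\lVert f \rVert_{2,\infty} = \max_{\lVert \boldsymbol{\alpha} \rVert_{\infty} \le 2} \lVert D^{\boldsymbol{\alpha}} f \rVert_{L^{\infty}}$, so that all mixed partial derivatives of order at most two in each coordinate are bounded.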