This paper studies the approximation capacity of ReLU neural networks with norm constraints on the weights. We prove upper and lower bounds on the approximation error of these networks for smooth function classes. The lower bound is derived through the Rademacher complexity of neural networks, which may be of independent interest. We apply these approximation bounds to analyze the convergence of regression with norm-constrained neural networks and of distribution estimation by GANs. In particular, we obtain convergence rates for over-parameterized neural networks. We also show that GANs can achieve the optimal rate for learning probability distributions when the discriminator is a suitably chosen norm-constrained neural network.