Analysing the statistical properties of neural networks is a central topic in statistics and machine learning. However, most results in the literature focus on the properties of the neural network that minimizes the training error. The goal of this paper is to consider aggregated neural networks using a Gaussian prior. The departure point of our approach is an arbitrary aggregate satisfying the PAC-Bayesian inequality. The main contribution is a precise nonasymptotic assessment of the estimation error appearing in the PAC-Bayes bound. We also review available bounds on the error of approximating a function by a neural network. Combining the bounds on the estimation and approximation errors, we establish risk bounds that are sharp enough to lead to minimax rates of estimation over Sobolev smoothness classes.
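For context, a standard McAllester-type form of the PAC-Bayesian inequality from the literature is sketched below; this is a generic statement, not necessarily the exact bound analysed in the paper. Here $R$ denotes the population risk, $\hat{R}_n$ the empirical risk on $n$ samples, $\pi$ the (Gaussian) prior, and $\rho$ the posterior over network parameters.

```latex
% Generic PAC-Bayes bound (McAllester-type), for losses in [0,1]:
% with probability at least 1 - \delta over the sample, simultaneously
% for all posteriors \rho,
\mathbb{E}_{\theta \sim \rho}\, R(\theta)
  \;\le\; \mathbb{E}_{\theta \sim \rho}\, \hat{R}_n(\theta)
  \;+\; \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\!\big(2\sqrt{n}/\delta\big)}{2n}}
```

Bounding the right-hand side then splits into controlling an estimation term (driven by the Kullback–Leibler divergence to the prior) and an approximation term (how well the network class approximates the target function), which is the decomposition the abstract refers to.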