Analysing the statistical properties of neural networks is a central topic in statistics and machine learning. However, most results in the literature concern the neural network that minimizes the training error. The goal of this paper is to study aggregated neural networks under a Gaussian prior. The departure point of our approach is an arbitrary aggregate satisfying the PAC-Bayesian inequality. Our main contribution is a precise non-asymptotic assessment of the estimation error appearing in the PAC-Bayes bound. The analysis is sharp enough to yield minimax rates of estimation over Sobolev smoothness classes.