Convolutional neural networks have shown impressive abilities in many applications, especially those related to classification tasks. For regression problems, however, the abilities of convolutional structures are not yet fully understood and call for further investigation. In this paper, we carry out a mean squared error analysis for deep convolutional neural networks. We show that, for additive ridge functions, convolutional neural networks followed by one fully connected layer with ReLU activation functions can achieve the minimax-optimal rate (up to a logarithmic factor). The input dimension appears only in the constant of the convergence rate. This work establishes the statistical optimality of convolutional neural networks and may shed light on why they perform well for high-dimensional inputs.
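For reference, additive ridge functions are commonly defined as sums of univariate functions composed with one-dimensional projections of the input; the formulation below is a standard one (an assumption, since the abstract does not spell out the definition, and the paper's exact conditions on the components may differ):

\[
f(x) \;=\; \sum_{j=1}^{J} g_j(\xi_j \cdot x), \qquad x \in \mathbb{R}^d,\ \ \xi_j \in \mathbb{R}^d,\ \ g_j : \mathbb{R} \to \mathbb{R}.
\]

Here each \(g_j\) is a univariate ridge profile and each \(\xi_j\) a projection direction. Under such a structure, the minimax rate is governed by the smoothness of the univariate functions \(g_j\) rather than by the ambient dimension \(d\), which is consistent with \(d\) entering the rate only through the constant.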