We state concentration inequalities for the output of the hidden layers of a stochastic deep neural network (SDNN), as well as for the output of the whole SDNN. These results allow us to introduce an expected classifier (EC), and to give a probabilistic upper bound on the classification error of the EC. We also derive the optimal number of layers for the SDNN via an optimal stopping procedure. We apply our analysis to a stochastic version of a feedforward neural network with ReLU activation function.