We establish convergence rates that are optimal up to a logarithmic factor for a class of deep neural networks in a classification setting, under a constraint sometimes referred to as the Tsybakov noise condition. We construct classifiers in a general setting in which the decision boundary of the Bayes rule can be approximated well by neural networks. Corresponding rates of convergence are proven with respect to the misclassification error. These rates are then shown to be optimal in the minimax sense when the boundary satisfies a smoothness condition. Non-optimal convergence rates have previously been obtained for this setting; our main contribution lies in improving these rates and establishing their optimality, which was an open problem. Furthermore, we show almost optimal rates under additional restrictions that circumvent the curse of dimensionality. Our analysis requires a condition that gives new insight into the constraint used: in a sense, it acts as a requirement for the "correct noise exponent" of a class of functions.
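As a point of reference (not stated in the abstract itself), one standard formulation of the Tsybakov noise condition, in terms of the regression function $\eta(x) = \mathbb{P}(Y = 1 \mid X = x)$ and a noise exponent $q \ge 0$, reads
\[
  \mathbb{P}_X\bigl(\lvert \eta(X) - \tfrac{1}{2} \rvert \le t\bigr) \;\le\; C\, t^{q}
  \qquad \text{for all } 0 < t \le t_0,
\]
with constants $C, t_0 > 0$. Larger $q$ means less probability mass near the decision level $1/2$ and typically permits faster rates for the misclassification error; the paper's exact formulation and its use of the "correct noise exponent" may differ from this standard version.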