We study the problem of learning classification functions from noiseless training samples, under the assumption that the decision boundary is of a certain regularity. We establish universal lower bounds for this estimation problem, for general classes of continuous decision boundaries. For the class of locally Barron-regular decision boundaries, we find that the optimal estimation rates are essentially independent of the underlying dimension and can be realized by empirical risk minimization methods over a suitable class of deep neural networks. These results are based on novel estimates of the $L^1$ and $L^\infty$ entropies of the class of Barron-regular functions.