This paper investigates the approximation properties of deep neural networks with piecewise-polynomial activation functions. We derive the required depth, width, and sparsity of a deep neural network to approximate any H\"{o}lder smooth function up to a given approximation error in H\"{o}lder norms in such a way that all weights of this neural network are bounded by $1$. The latter feature is essential to control generalization errors in many statistical and machine learning applications.
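As background for the terminology above, a standard definition of the H\"{o}lder norm used to measure such approximation errors is sketched below; this is generic notation for illustration, and the paper's exact normalization of the H\"{o}lder ball may differ.

```latex
% Background sketch (not the paper's own statement): a common H\"{o}lder norm.
% For \beta = s + r with s \in \mathbb{N}_0, r \in (0,1], and f : D \subseteq \mathbb{R}^d \to \mathbb{R},
\[
  \|f\|_{\mathcal{H}^{\beta}(D)}
  \;=\;
  \max_{|\alpha| \le s} \, \sup_{x \in D} \bigl|\partial^{\alpha} f(x)\bigr|
  \;+\;
  \max_{|\alpha| = s} \, \sup_{\substack{x, y \in D \\ x \neq y}}
  \frac{\bigl|\partial^{\alpha} f(x) - \partial^{\alpha} f(y)\bigr|}{\|x - y\|^{r}} .
\]
% A function is \beta-H\"{o}lder smooth when this norm is finite; approximation
% "in H\"{o}lder norms" means the error is measured in such a norm of lower order.
```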