This paper considers deep neural networks for learning weakly dependent processes in a general framework that includes, for instance, regression estimation, time series prediction, and time series classification. The $\psi$-weak dependence structure considered is quite broad and covers other conditions such as mixing and association. Firstly, the approximation of smooth functions by deep neural networks with a broad class of activation functions is considered. We derive the required depth, width, and sparsity of a deep neural network to approximate any H\"{o}lder smooth function defined on any compact set $\mx$. Secondly, we establish a bound on the excess risk for the learning of weakly dependent observations by deep neural networks. When the target function is sufficiently smooth, this bound is close to the usual rate $\mathcal{O}(n^{-1/2})$.