Estimation of a regression function from independent and identically distributed random variables is considered. The $L_2$ error with integration with respect to the design measure is used as the error criterion. Over-parametrized deep neural network estimates are defined, where all the weights are learned by gradient descent. It is shown that the expected $L_2$ error of these estimates converges to zero at a rate close to $n^{-1/(1+d)}$ in the case that the regression function is H\"older smooth with H\"older exponent $p \in [1/2,1]$. In the case of an interaction model, where the regression function is assumed to be a sum of H\"older smooth functions each of which depends on only $d^*$ of the $d$ components of the design variable, it is shown that these estimates achieve the corresponding $d^*$-dimensional rate of convergence.