We show that $d$-variate polynomials of degree $R$ can be represented on $[0,1]^d$ by shallow neural networks (SNNs) of width $d+1+\sum_{r=2}^R\binom{r+d-1}{d-1}\left[\binom{r+d-1}{d-1}+1\right]$. Moreover, via SNN representations of localized Taylor polynomials of univariate $C^\beta$-smooth functions, we derive for shallow networks the minimax optimal rate of convergence, up to a logarithmic factor, to an unknown univariate regression function.
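The stated width bound can be evaluated numerically. The sketch below is ours, not from the paper (the function name `snn_width` is an assumption); it computes $d+1+\sum_{r=2}^R\binom{r+d-1}{d-1}\left[\binom{r+d-1}{d-1}+1\right]$, treating the sum as empty when $R<2$.

```python
import math

def snn_width(d: int, R: int) -> int:
    """Width bound for a shallow network representing a d-variate
    polynomial of degree R on [0,1]^d:
    d + 1 + sum_{r=2}^{R} C(r+d-1, d-1) * (C(r+d-1, d-1) + 1).
    """
    width = d + 1
    for r in range(2, R + 1):
        # number of degree-r monomials in d variables
        m = math.comb(r + d - 1, d - 1)
        width += m * (m + 1)
    return width

# Example: bivariate quadratics (d=2, R=2) give width 3 + 3*4 = 15
print(snn_width(2, 2))  # → 15
```

For $d=1$ the binomial coefficient is always $1$, so the width grows linearly in $R$; for fixed $R$ it grows polynomially in $d$.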