We consider approximation rates of sparsely connected deep rectified linear unit (ReLU) and rectified power unit (RePU) neural networks for functions in Besov spaces $B^\alpha_{q}(L^p)$ in arbitrary dimension $d$ and on general domains. We show that \alert{deep rectifier} networks with a fixed activation function attain optimal or near-optimal approximation rates for functions in the Besov space $B^\alpha_{\tau}(L^\tau)$ on the critical embedding line $1/\tau=\alpha/d+1/p$ for \emph{arbitrary} smoothness order $\alpha>0$. By interpolation theory, this implies that the entire range of smoothness classes at or above the critical line is (near-)optimally approximated by deep ReLU/RePU networks.