We show that deep sparse ReLU networks with ternary weights and deep ReLU networks with binary weights can approximate $\beta$-H\"older functions on $[0,1]^d$. Moreover, for any interval $[a,b)\subset\mathbb{R}$, continuous functions on $[0,1]^d$ can be approximated by networks of depth $2$ with the binary activation function $\mathds{1}_{[a,b)}$.
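For intuition in dimension $d=1$, a depth-$2$ network with activation $\mathds{1}_{[a,b)}$ can already realize piecewise-constant approximants; the following is a minimal sketch based on the standard step-function argument, not necessarily the construction used in the paper. Partition $[0,1)$ into cells $[x_k,x_{k+1})$, $k=0,\dots,N-1$, and set
\[
\hat f(x) \;=\; \sum_{k=0}^{N-1} f(x_k)\,\mathds{1}_{[a,b)}\!\big(w_k x + c_k\big),
\qquad
w_k=\frac{b-a}{x_{k+1}-x_k},\quad c_k=a-w_k x_k,
\]
so that the $k$-th hidden unit fires exactly when $x\in[x_k,x_{k+1})$, since $w_k x + c_k = a + w_k(x-x_k)\in[a,b)$ precisely on that cell. The uniform error $\|f-\hat f\|_\infty$ is then bounded by the modulus of continuity of $f$ at mesh size $\max_k(x_{k+1}-x_k)$.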