We show that deep sparse ReLU networks with ternary weights and deep ReLU networks with binary weights can approximate $\beta$-H\"older functions on $[0,1]^d$. Moreover, H\"older continuous functions on $[0,1]^d$ can be approximated by networks of depth $2$ with the binary activation function $\mathds{1}_{[0,1)}$.