We show that deep belief networks with binary hidden units can approximate any multivariate probability density under very mild integrability requirements on the parental density of the visible nodes. The approximation is measured in the $L^q$-norm for $q\in[1,\infty]$ ($q=\infty$ corresponding to the supremum norm) and in Kullback-Leibler divergence. Furthermore, we establish sharp quantitative bounds on the approximation error in terms of the number of hidden units.