We study the approximation power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any $d$-dimensional, smooth function on a compact set with a rate of order $W^{-p/d}$, where $W$ is the number of nonzero weights in the network and $p$ is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, i.e., DNNs that are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order $M^d$ achieve an approximation rate of $M^{-2p}$. As a consequence, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights $W_0$ in the network and show an approximation rate of $W_0^{-p/d}$. This more general result finally helps us to understand which network topology guarantees a given target accuracy.
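The step from the width-based rate $M^{-2p}$ to the weight-based rate $W_0^{-p/d}$ follows from a simple count; as a sketch of the arithmetic, assuming a fully connected topology with fixed depth $L$ and width of order $M^d$ (so that each of the $O(L)$ weight matrices has roughly $M^d \times M^d$ entries):
\[
W_0 \asymp L \cdot \bigl(M^d\bigr)^2 = L \cdot M^{2d}
\quad\Longrightarrow\quad
M \asymp W_0^{1/(2d)}
\quad\Longrightarrow\quad
M^{-2p} \asymp \Bigl(W_0^{1/(2d)}\Bigr)^{-2p} = W_0^{-p/d}.
\]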