We study the space of functions computed by random-layered machines, including deep neural networks and Boolean circuits. Investigating the distribution of Boolean functions computed by recurrent and layer-dependent architectures, we find that it is identical in both models. Depending on the initial conditions and the computing elements used, we characterize the space of functions computed in the large-depth limit and show that the macroscopic entropy of Boolean functions either monotonically increases or decreases with growing depth.
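As a rough illustration of the entropy-versus-depth statement (not the paper's own method), the sketch below samples random layered Boolean circuits and estimates the Shannon entropy of the empirical distribution over the truth tables they compute, as a function of depth. The width, gate set, input size, and sample counts are illustrative assumptions, chosen only to keep the simulation small.

```python
# Minimal Monte Carlo sketch (illustrative assumptions, not the paper's setup):
# estimate the Shannon entropy of the distribution of Boolean functions computed
# by random layered circuits of 2-input gates, as a function of circuit depth.
import itertools
import math
import random
from collections import Counter

N_INPUTS = 3     # circuit inputs; the truth table has 2**N_INPUTS rows
WIDTH = 4        # gates per layer (assumed width)
SAMPLES = 2000   # random circuits sampled per depth

GATES = [
    lambda a, b: a and b,        # AND
    lambda a, b: a or b,         # OR
    lambda a, b: not (a and b),  # NAND
    lambda a, b: a != b,         # XOR
]

def random_layer(in_width, out_width):
    """Each node applies a random gate to two randomly wired inputs."""
    return [(random.choice(GATES),
             random.randrange(in_width),
             random.randrange(in_width)) for _ in range(out_width)]

def truth_table(depth):
    """Sample one random circuit and return the truth table of its first output."""
    layers = [random_layer(N_INPUTS if d == 0 else WIDTH, WIDTH)
              for d in range(depth)]
    table = []
    for bits in itertools.product([False, True], repeat=N_INPUTS):
        state = list(bits)
        for layer in layers:
            state = [g(state[i], state[j]) for (g, i, j) in layer]
        table.append(state[0])
    return tuple(table)

def entropy_of_functions(depth):
    """Shannon entropy (bits) of the empirical distribution over truth tables."""
    counts = Counter(truth_table(depth) for _ in range(SAMPLES))
    return -sum((c / SAMPLES) * math.log2(c / SAMPLES) for c in counts.values())

if __name__ == "__main__":
    for depth in (1, 2, 4, 8, 16):
        print(f"depth {depth:2d}: entropy ~ {entropy_of_functions(depth):.3f} bits")
```

Whether the estimated entropy grows or shrinks with depth depends on the chosen gate set and wiring (the analogue of the initial conditions and computing elements discussed above); swapping the gate list, for example, changes the trend the simulation exhibits.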