In a recent work, arXiv:2008.08601, Halverson, Maiti and Stoner proposed a description of neural networks in terms of a Wilsonian effective field theory. The infinite-width limit is mapped to a free field theory, while finite-$N$ corrections are taken into account by interactions (non-Gaussian terms in the action). In this paper, we study two related aspects of this correspondence. First, we comment on the concepts of locality and power counting in this context. Indeed, these usual space-time notions may not hold for neural networks (since inputs can be arbitrary); however, the renormalization group provides natural notions of locality and scaling. Moreover, we comment on several subtleties, for example the fact that data components may not have a permutation symmetry: in that case, we argue that random tensor field theories could provide a natural generalization. Second, we improve on the perturbative Wilsonian renormalization of arXiv:2008.08601 by providing an analysis in terms of the nonperturbative renormalization group using the Wetterich-Morris equation. An important difference with the usual nonperturbative RG analysis is that only the effective (IR) 2-point function is known, which requires setting up the problem with care. Our aim is to provide a useful formalism to investigate neural network behavior beyond the large-width limit (i.e.~far from the Gaussian limit) in a nonperturbative fashion. A major result of our analysis is that changing the standard deviation of the neural network weight distribution can be interpreted as a renormalization flow in the space of networks. We focus on translation-invariant kernels and provide preliminary numerical results.
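For the reader's convenience, the Wetterich-Morris equation referred to above can be recalled in its standard form; the notation below ($\Gamma_k$ for the scale-dependent effective average action, $R_k$ for the infrared regulator, $t=\ln k$) follows the usual nonperturbative RG convention and is not specific to the setup of this paper:
\[
  \partial_t \Gamma_k[\phi]
  = \frac{1}{2}\,\mathrm{Tr}\!\left[\left(\Gamma_k^{(2)}[\phi] + R_k\right)^{-1} \partial_t R_k\right],
  \qquad t = \ln k,
\]
where $\Gamma_k^{(2)}[\phi]$ denotes the second functional derivative of $\Gamma_k$ with respect to the field and the trace runs over the relevant field indices and momenta (or, in the present context, over the data space on which the kernels are defined).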