The crucial role played by the underlying symmetries of high energy physics and lattice field theories calls for the implementation of such symmetries in the neural network architectures that are applied to the physical system under consideration. In these proceedings, we focus on the consequences of incorporating translational equivariance among the network properties, particularly in terms of performance and generalization. The benefits of equivariant networks are exemplified by studying a complex scalar field theory, on which various regression and classification tasks are examined. For a meaningful comparison, promising equivariant and non-equivariant architectures are identified by means of a systematic search. The results indicate that in most of the tasks our best equivariant architectures can perform and generalize significantly better than their non-equivariant counterparts, which applies not only to physical parameters beyond those represented in the training set, but also to different lattice sizes.
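To illustrate the kind of architecture the abstract refers to, the following is a minimal sketch (not the authors' actual network) of a translation-equivariant model for a complex scalar field on a periodic 2D lattice: convolutions with circular padding commute with lattice translations, and a global spatial average yields translation-invariant predictions that also accept arbitrary lattice sizes. All layer sizes and names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class EquivariantRegressor(nn.Module):
    """Sketch of a translation-equivariant network for a complex scalar field."""

    def __init__(self, channels: int = 16):
        super().__init__()
        # Input: 2 channels (real and imaginary part of the complex field phi).
        # padding_mode="circular" enforces periodic boundary conditions, so each
        # convolution is equivariant under lattice translations.
        self.features = nn.Sequential(
            nn.Conv2d(2, channels, kernel_size=3, padding=1, padding_mode="circular"),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, padding_mode="circular"),
            nn.ReLU(),
        )
        # 1x1 convolution maps features to a per-site prediction.
        self.head = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, phi: torch.Tensor) -> torch.Tensor:
        # phi: (batch, 2, Lx, Ly) for any lattice extent Lx, Ly.
        h = self.features(phi)
        # Global average over lattice sites gives a translation-invariant scalar
        # that is independent of the lattice size (cross-volume generalization).
        return self.head(h).mean(dim=(-2, -1)).squeeze(-1)

# Usage: the same weights apply to different lattice sizes.
model = EquivariantRegressor()
phi_small = torch.randn(8, 2, 16, 16)   # 16x16 lattice configurations
phi_large = torch.randn(8, 2, 32, 32)   # 32x32 lattice configurations
print(model(phi_small).shape, model(phi_large).shape)  # torch.Size([8]) twice
```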