Parameter-space and function-space provide two different duality frames in which to study neural networks. We demonstrate that symmetries of network densities may be determined via dual computations of network correlation functions, even when the density is unknown and the network is not equivariant. Symmetry-via-duality relies on invariance properties of the correlation functions, which stem from the choice of network parameter distributions. Input and output symmetries of neural network densities are determined, which recover known Gaussian process results in the infinite width limit. The mechanism may also be utilized to determine symmetries during training, when parameters are correlated, as well as symmetries of the Neural Tangent Kernel. We demonstrate that the amount of symmetry in the initialization density affects the accuracy of networks trained on Fashion-MNIST, and that symmetry breaking helps only when it is in the direction of ground truth.
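As a concrete illustration of the parameter-space side of this duality, below is a minimal sketch (not code from the paper; all function names and hyperparameters are illustrative assumptions). It Monte-Carlo estimates the two-point correlation function of single-hidden-layer networks with i.i.d. Gaussian parameters, then checks that the correlator is unchanged when the inputs are acted on by a random orthogonal transformation. This is the parameter-space computation behind the input-symmetry claim above: the Gaussian weight distribution is rotation invariant, so the correlators, and hence the (unknown) network density, inherit the symmetry.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, width, n_nets = 4, 3, 64, 20000  # illustrative sizes

def sample_network_outputs(xs):
    # xs: (n_points, d_in). Returns (n_nets, n_points, d_out): outputs of
    # freshly drawn single-hidden-layer networks with i.i.d. Gaussian weights.
    W0 = rng.normal(0.0, 1.0 / np.sqrt(d_in), size=(n_nets, width, d_in))
    b0 = rng.normal(0.0, 1.0, size=(n_nets, width, 1))
    W1 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_nets, d_out, width))
    pre = np.einsum('nwd,pd->nwp', W0, xs) + b0        # pre-activations
    return np.einsum('now,nwp->npo', W1, np.tanh(pre))

def two_point(xs):
    # Empirical two-point correlator G(x_p, x_q) = E[f(x_p) . f(x_q)],
    # with the expectation taken over the parameter density.
    f = sample_network_outputs(xs)
    return np.einsum('npo,nqo->pq', f, f) / n_nets

# Random orthogonal transformation of input space via QR decomposition.
R, _ = np.linalg.qr(rng.normal(size=(d_in, d_in)))

xs = rng.normal(size=(2, d_in))
G = two_point(xs)
G_rot = two_point(xs @ R.T)  # correlator evaluated at rotated inputs R x
print(G)
print(G_rot)  # matches G up to Monte Carlo error: input rotation symmetry
```

The same check with the transformation applied to the network outputs instead of the inputs would probe the output symmetry; no knowledge of the network density, and no equivariance of the network itself, is needed in either case.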