We provide a full characterisation of all of the possible group equivariant neural networks whose layers are some tensor power of $\mathbb{R}^{n}$ for three symmetry groups that are missing from the machine learning literature: $O(n)$, the orthogonal group; $SO(n)$, the special orthogonal group; and $Sp(n)$, the symplectic group. In particular, we find a spanning set of matrices for the learnable, linear, equivariant layer functions between such tensor power spaces in the standard basis of $\mathbb{R}^{n}$ when the group is $O(n)$ or $SO(n)$, and in the symplectic basis of $\mathbb{R}^{n}$ when the group is $Sp(n)$. The neural networks that we characterise are simple to implement since our method circumvents the typical requirement when building group equivariant neural networks of having to decompose the tensor power spaces of $\mathbb{R}^{n}$ into irreducible representations. We also describe how our approach generalises to the construction of neural networks that are equivariant to local symmetries. The theoretical background for our results comes from the Schur-Weyl dualities that were established by Brauer in his 1937 paper "On Algebras Which are Connected with the Semisimple Continuous Groups" for each of the three groups in question. We suggest that Schur-Weyl duality is a powerful mathematical concept that could be used to understand the structure of neural networks that are equivariant to groups beyond those considered in this paper.
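As a concrete illustration of the approach described above (this is a minimal sketch, not the paper's own implementation), the following Python/NumPy snippet parameterises an $O(n)$-equivariant linear layer on 2-tensors as a learnable combination of the classical spanning set $\{X, X^{T}, \operatorname{tr}(X) I\}$, and checks equivariance numerically; the function and variable names are illustrative only.

```python
import numpy as np

def equivariant_layer_2tensor(X, w):
    """O(n)-equivariant linear map on 2-tensors, parameterised by one
    learnable weight per spanning matrix: X, X^T and tr(X) * I."""
    n = X.shape[0]
    return w[0] * X + w[1] * X.T + w[2] * np.trace(X) * np.eye(n)

# Numerical check of equivariance: f(Q X Q^T) == Q f(X) Q^T for orthogonal Q.
rng = np.random.default_rng(0)
n = 5
X = rng.standard_normal((n, n))
w = rng.standard_normal(3)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal matrix

lhs = equivariant_layer_2tensor(Q @ X @ Q.T, w)
rhs = Q @ equivariant_layer_2tensor(X, w) @ Q.T
print(np.allclose(lhs, rhs))  # True
```

Note that the layer is built directly from the spanning matrices in the standard basis of $\mathbb{R}^{n}$; no decomposition of the tensor power space into irreducible representations is required.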