Existing equivariant neural networks for continuous groups require discretization or explicit group representations. All these approaches require detailed knowledge of the group parametrization and cannot learn entirely new symmetries. We propose to work with the Lie algebra (infinitesimal generators) instead of the Lie group. Our model, the Lie algebra convolutional network (L-conv), can learn potential symmetries and does not require discretization of the group. We show that L-conv can serve as a building block to construct any group-equivariant architecture. We discuss how CNNs and Graph Convolutional Networks are related to, and can be expressed as, L-conv with appropriate groups. We also derive the MSE loss for a single L-conv layer and find a deep relation with Lagrangians used in physics, with some of the physics aiding in defining generalization and symmetries in the loss landscape. Conversely, L-conv could be used to propose more general equivariant ans\"atze for scientific machine learning.