The expressive power of Graph Neural Networks (GNNs) has been studied extensively through the Weisfeiler-Leman (WL) graph isomorphism test. However, standard GNNs and the WL framework are inapplicable for geometric graphs embedded in Euclidean space, such as biomolecules, materials, and other physical systems. In this work, we propose a geometric version of the WL test (GWL) for discriminating geometric graphs while respecting the underlying physical symmetries: permutations, rotations, reflections, and translations. We use GWL to characterise the expressive power of geometric GNNs that are invariant or equivariant to physical symmetries in terms of distinguishing geometric graphs. GWL unpacks how key design choices influence geometric GNN expressivity: (1) Invariant layers have limited expressivity as they cannot distinguish one-hop identical geometric graphs; (2) Equivariant layers distinguish a larger class of graphs by propagating geometric information beyond local neighbourhoods; (3) Higher-order tensors and scalarisation enable maximally powerful geometric GNNs; and (4) GWL's discrimination-based perspective is equivalent to universal approximation. Synthetic experiments supplementing our results are available at https://github.com/chaitjo/geometric-gnn-dojo
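As a minimal, hypothetical NumPy sketch (not taken from the geometric-gnn-dojo repository), the snippet below illustrates point (1): two geometric graphs that are not related by any rotation or translation, yet whose nodes have identical 1-hop distance multisets, so a single round of distance-based invariant message passing cannot tell them apart. The function name `one_hop_distance_features` and the example coordinates are illustrative assumptions.

```python
import numpy as np

def one_hop_distance_features(pos, edges):
    # Sorted multiset of 1-hop neighbour distances per node: the information
    # available to one round of distance-based invariant message passing.
    nbrs = {i: [] for i in range(len(pos))}
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
    return sorted(
        tuple(sorted(round(float(np.linalg.norm(pos[i] - pos[j])), 6) for j in nbrs[i]))
        for i in range(len(pos))
    )

# Two 4-node chains with the same topology and unit-length edges:
# chain A is straight, chain B is bent at the final node, so no rotation,
# reflection, or translation maps one onto the other.
edges = [(0, 1), (1, 2), (2, 3)]
pos_a = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
pos_b = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [2.0, 1.0]])

# Both graphs yield identical local invariant features.
assert one_hop_distance_features(pos_a, edges) == one_hop_distance_features(pos_b, edges)
```

Distinguishing the two chains requires propagating geometric information, such as relative position vectors, beyond the local neighbourhood, which is what the equivariant layers discussed in points (2) and (3) provide.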