Graph Neural Networks (GNNs) are often used for tasks involving the geometry of a given graph, such as molecular dynamics simulation. Although the distance matrix of a geometric graph contains complete geometric information, it has been shown that Message Passing Neural Networks (MPNNs) are insufficient for learning this geometry. In this work, we expand the known families of counterexamples that MPNNs cannot distinguish from their distance matrices by constructing novel families of symmetric geometric graphs. We then propose $k$-DisGNNs, which can effectively exploit the rich geometry contained in the distance matrix. We demonstrate the high expressive power of our models and prove that some existing well-designed geometric models can be unified by $k$-DisGNNs as special cases. Most importantly, we establish a connection between geometric deep learning and traditional graph representation learning, showing that highly expressive GNN models originally designed for graph structure learning can also be applied to geometric deep learning problems with impressive performance, and that existing complex, equivariant models are not the only solution. Experimental results verify our theory.