Message passing graph neural networks (GNNs) are known to have their expressiveness upper-bounded by the 1-dimensional Weisfeiler-Lehman (1-WL) algorithm. To achieve more powerful GNNs, existing attempts either require ad hoc features or involve operations that incur high time and space complexities. In this work, we propose a general and provably powerful GNN framework that preserves the scalability of the message passing scheme. In particular, we first propose to empower 1-WL for graph isomorphism testing by considering edges among neighbors, giving rise to NC-1-WL. The expressiveness of NC-1-WL is shown theoretically to be strictly above 1-WL but below 3-WL. Further, we propose the NC-GNN framework as a differentiable neural version of NC-1-WL. Our simple implementation of NC-GNN is provably as powerful as NC-1-WL. Experiments demonstrate that NC-GNN achieves remarkable performance on various benchmarks.