Recently, subgraph GNNs have emerged as an important direction for developing expressive graph neural networks (GNNs). While numerous architectures have been proposed, our understanding of how the various design paradigms differ in expressive power remains limited, and it is unclear which design principle achieves maximal expressiveness with minimal architectural complexity. Targeting these fundamental questions, this paper conducts a systematic study of general node-based subgraph GNNs through the lens of Subgraph Weisfeiler-Lehman tests (SWL). Our central result is a complete hierarchy of SWL with strictly growing expressivity. Concretely, we prove that any node-based subgraph GNN falls into one of six SWL equivalence classes, among which $\mathsf{SSWL}$ achieves the maximal expressive power. We also study how these equivalence classes differ in terms of practical expressiveness, such as encoding graph distance and biconnectivity. In addition, we give a tight expressivity upper bound for all SWL algorithms by establishing a close relation with localized versions of the Folklore WL tests (FWL). Overall, our results provide insights into the power of existing subgraph GNNs, guide the design of new architectures, and point out their limitations by revealing an inherent gap with the 2-FWL test. Finally, experiments on the ZINC benchmark demonstrate that $\mathsf{SSWL}$-inspired subgraph GNNs significantly outperform prior architectures despite their great simplicity.