Graph Neural Networks (GNNs) are inherently limited in their expressive power. Recent seminal works (Xu et al., 2019; Morris et al., 2019b) introduced the Weisfeiler-Lehman (WL) hierarchy as a measure of that power. Although this hierarchy has propelled significant advances in GNN analysis and architecture development, it suffers from several limitations: a complex definition that offers no direct guidance for model improvement, and a granularity too coarse to study current GNNs. This paper introduces an alternative expressive power hierarchy based on the ability of GNNs to calculate equivariant polynomials of a given degree. As a first step, we provide a full characterization of all equivariant graph polynomials by introducing a concrete basis, significantly generalizing previous results. Each basis element corresponds to a specific multi-graph, and its evaluation on a graph data input amounts to a tensor contraction problem. Second, we propose algorithmic tools for evaluating the expressiveness of GNNs using tensor contraction sequences, and compute the expressive power of popular GNNs. Finally, we enhance the expressivity of common GNN architectures by adding polynomial features or additional operations/aggregations inspired by our theory. These enhanced GNNs achieve state-of-the-art results in experiments across multiple graph learning benchmarks.
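To make the correspondence between basis elements and tensor contractions concrete, the following is a minimal NumPy sketch (illustrative only, not the paper's basis or implementation): it evaluates two simple graph polynomials over an adjacency matrix, one invariant (the triangle multi-graph, all indices contracted) and one equivariant (a 2-path multi-graph with one free node index).

```python
import numpy as np

# Illustrative sketch: each polynomial below corresponds to a small
# multi-graph, and evaluating it is a tensor contraction over A.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Invariant example: triangle multi-graph. Contracting all indices gives
# sum_{i,j,k} A_ij * A_jk * A_ki = trace(A^3), unchanged under any
# relabeling (permutation) of the nodes.
triangles = np.einsum('ij,jk,ki->', A, A, A)

# Equivariant example: 2-path multi-graph with one free node index i.
# P(A)_i = sum_{j,k} A_ij * A_jk counts 2-paths starting at node i;
# permuting the nodes of A permutes the entries of P(A) the same way.
two_paths = np.einsum('ij,jk->i', A, A)

print(triangles)  # 6.0: one triangle (nodes 0,1,2), counted once per orientation/rotation
print(two_paths)  # per-node 2-path counts, e.g. [5. 5. 5. 3.]
```

In this view, a GNN's expressive power can be probed by asking which such contraction patterns (multi-graphs) its layers can realize.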