Recent work shows that the expressive power of Graph Neural Networks (GNNs) in distinguishing non-isomorphic graphs is exactly the same as that of the Weisfeiler-Lehman (WL) graph test. In particular, it shows that the WL test can be simulated by GNNs. However, those simulations involve neural networks for the `combine' function of size polynomial or even exponential in the number of graph nodes $n$, as well as feature vectors of length linear in $n$. We present an improved simulation of the WL test on GNNs with \emph{exponentially} lower complexity. In particular, the neural network implementing the combine function in each node has only a polylogarithmic number of parameters in $n$, and the feature vectors exchanged by the nodes of the GNN consist of only $O(\log n)$ bits. We also give logarithmic lower bounds for the feature vector length and the size of the neural networks, showing the (near-)optimality of our construction.
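For reference, below is a minimal sketch of the classical 1-WL colour refinement test that the abstract refers to. It is an illustrative implementation of the standard algorithm, not the paper's GNN construction; the per-round re-indexing of colours into small integers is only meant to hint at why identifiers of $O(\log n)$ bits suffice, since at most $n$ distinct colours can ever occur. All function names here are hypothetical.

```python
# Minimal sketch of 1-WL colour refinement (illustrative, not the paper's
# GNN simulation). Graphs are adjacency dicts: node -> iterable of neighbours.
from collections import Counter


def wl_colors(adj, rounds=None):
    """Iteratively refine node colours until stable (at most n rounds)."""
    n = len(adj)
    rounds = n if rounds is None else rounds
    color = {v: 0 for v in adj}  # uniform initial colour (unlabelled graphs)
    for _ in range(rounds):
        # Each node combines its own colour with the multiset of neighbour colours.
        signature = {
            v: (color[v], tuple(sorted(Counter(color[u] for u in adj[v]).items())))
            for v in adj
        }
        # Re-index signatures to small integers: at most n distinct colours
        # occur, so each colour id fits in O(log n) bits.
        palette = {sig: i for i, sig in enumerate(sorted(set(signature.values())))}
        new_color = {v: palette[signature[v]] for v in adj}
        if new_color == color:  # stable colouring reached
            break
        color = new_color
    return color


def wl_distinguishes(adj1, adj2):
    """Refine colours on the disjoint union so colour ids are comparable,
    then compare the per-graph colour histograms."""
    union = {("a", v): [("a", u) for u in nbrs] for v, nbrs in adj1.items()}
    union.update({("b", v): [("b", u) for u in nbrs] for v, nbrs in adj2.items()})
    color = wl_colors(union)
    h1 = Counter(c for (g, _), c in color.items() if g == "a")
    h2 = Counter(c for (g, _), c in color.items() if g == "b")
    return h1 != h2


if __name__ == "__main__":
    triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    path3 = {0: [1], 1: [0, 2], 2: [1]}
    print(wl_distinguishes(triangle, path3))  # True: degree histograms differ
```

A GNN layer mirrors one refinement round: the combine function plays the role of the signature-plus-hashing step above, which is why its size and the width of the exchanged feature vectors govern the cost of simulating the WL test.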