Graph Neural Networks (GNNs), with their ability to integrate graph information, have been widely used for data analysis. However, the expressive power of GNNs has only been studied for graph-level tasks, not for node-level tasks such as node classification, where one tries to interpolate missing nodal labels from the observed ones. In this paper, we study the expressive power of GNNs for the aforementioned classification task, which is in essence a function interpolation problem. Explicitly, we derive the number of weights and layers needed for a GNN to interpolate a band-limited function in $\mathbb{R}^d$. Our result shows that the number of weights needed to $\epsilon$-approximate a band-limited function using the GNN architecture is much smaller than the best known bound for a fully connected neural network (NN): one only needs $O((\log \epsilon^{-1})^{d})$ weights, using a GNN trained with $O((\log \epsilon^{-1})^{d})$ samples, to $\epsilon$-approximate a discretized band-limited signal in $\mathbb{R}^d$. The result is obtained by drawing a connection between the GNN architecture and classical sampling theorems, making our work the first attempt in this direction.
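For context (this statement is not taken from the paper itself, but is the standard form of the classical result the abstract alludes to), the Shannon sampling theorem in $\mathbb{R}^d$ says that a band-limited $f \in L^2(\mathbb{R}^d)$, with Fourier transform supported in $[-\pi B, \pi B]^d$, is exactly recovered from its samples on the grid $\frac{1}{B}\mathbb{Z}^d$ via tensor-product sinc interpolation:
$$
f(x) \;=\; \sum_{k \in \mathbb{Z}^d} f\!\left(\frac{k}{B}\right) \prod_{j=1}^{d} \operatorname{sinc}\bigl(B x_j - k_j\bigr),
\qquad \operatorname{sinc}(t) = \frac{\sin(\pi t)}{\pi t}.
$$
This is the sense in which node classification on a discretized band-limited signal reduces to an interpolation problem from $O((\log \epsilon^{-1})^{d})$-many observed samples.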