Although theoretical properties of graph neural networks (GNNs), such as expressive power and over-smoothing, have been extensively studied in recent years, their convergence properties are a relatively new direction. In this paper, we investigate the convergence of one powerful family of GNNs, Invariant Graph Networks (IGNs), over graphs sampled from graphons. We first prove the stability of linear layers for general $k$-IGNs (of order $k$) based on a novel interpretation of linear equivariant layers. Building upon this result, we prove the convergence of $k$-IGNs under the model of \citet{ruiz2020graphon}, where one has access to the edge weights but the convergence error is measured for graphon inputs. Under the more natural (and more challenging) setting of \citet{keriven2020convergence}, where one can only access the 0-1 adjacency matrix sampled according to the edge probabilities, we first show a negative result: convergence is not possible for general IGNs. We then obtain convergence for a subset of IGNs, denoted IGN-small, after estimating the edge probabilities. We show that IGN-small still contains a class of functions rich enough to approximate spectral GNNs arbitrarily well. Lastly, we perform experiments on various graphon models to verify our statements.