In many important graph data processing applications, the acquired information includes both node features and observations of the graph topology. Graph neural networks (GNNs) are designed to exploit both sources of evidence, but they do not optimally trade off their utility, nor do they integrate them in a universal manner. Here, universality refers to independence of homophily or heterophily assumptions about the graph. We address these issues by introducing a new Generalized PageRank (GPR) GNN architecture that adaptively learns the GPR weights so as to jointly optimize node feature and topological information extraction, regardless of the extent to which the node labels are homophilic or heterophilic. The learned GPR weights automatically adjust to the node label pattern, irrespective of the type of initialization, and thereby guarantee excellent learning performance for label patterns that are usually hard to handle. Furthermore, they allow one to avoid feature over-smoothing, a process which renders feature information nondiscriminative, without requiring the network to be shallow. Our accompanying theoretical analysis of the GPR-GNN method is facilitated by novel synthetic benchmark datasets generated by the so-called contextual stochastic block model. We also compare the performance of our GNN architecture with that of several state-of-the-art GNNs on the problem of node classification, using well-known homophilic and heterophilic benchmark datasets. The results demonstrate that GPR-GNN offers significant performance improvement over existing techniques on both synthetic and benchmark data.
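The abstract does not spell out the propagation rule, so the following is only a minimal sketch of the core idea as it is commonly presented for GPR-GNN: node features are first transformed by an MLP, H^(0) = f_theta(X), and the output is a learned combination Z = sum_{k=0}^{K} gamma_k A~^k H^(0), where A~ is the symmetrically normalized adjacency with self-loops and the GPR weights gamma_k are trainable parameters. The class name, hyperparameters (K, alpha), and the PPR-style initialization below are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GPRGNN(nn.Module):
    """Minimal GPR-GNN sketch: an MLP on node features followed by
    Generalized PageRank propagation with learnable weights gamma_k."""

    def __init__(self, in_dim, hid_dim, out_dim, K=10, alpha=0.1):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, out_dim)
        # Assumed PPR-style initialization gamma_k = alpha * (1 - alpha)^k;
        # since the weights are trained, they can adapt away from this init.
        gamma = alpha * (1 - alpha) ** torch.arange(K + 1, dtype=torch.float)
        gamma[-1] = (1 - alpha) ** K  # place the remaining mass on the last hop
        self.gamma = nn.Parameter(gamma)

    def forward(self, x, adj_norm):
        # x: (N, in_dim) node features; adj_norm: (N, N) symmetrically
        # normalized adjacency with self-loops, D^{-1/2} (A + I) D^{-1/2}.
        h = self.lin2(F.relu(self.lin1(x)))   # H^(0) = f_theta(X)
        z = self.gamma[0] * h
        for k in range(1, self.gamma.numel()):
            h = adj_norm @ h                  # one propagation step: A~^k H^(0)
            z = z + self.gamma[k] * h         # Z = sum_k gamma_k A~^k H^(0)
        return z                              # per-node class logits
```

Because the gamma_k can become negative during training, the learned filter can emphasize high-frequency graph signals, which is one informal way to see how a single architecture can serve both homophilic and heterophilic label patterns.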
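The contextual stochastic block model (cSBM) referenced above couples an SBM graph with Gaussian-mixture node features. The generator below is a hypothetical helper following the standard two-community cSBM parameterization (Deshpande et al., 2018), not the paper's exact benchmark code; the parameter lam > 0 yields homophilic graphs and lam < 0 heterophilic ones, while mu controls how informative the features are.

```python
import numpy as np

def make_csbm(n=1000, p=100, mu=1.0, lam=1.0, d=10, seed=0):
    """Hypothetical cSBM sampler: two balanced communities with labels in
    {-1, +1}, Gaussian node features whose means depend on the label, and
    an SBM graph with average degree roughly d."""
    rng = np.random.default_rng(seed)
    y = rng.integers(0, 2, size=n) * 2 - 1               # community labels
    u = rng.normal(size=p) / np.sqrt(p)                  # latent feature direction
    # Features: X_i = sqrt(mu / n) * y_i * u + noise / sqrt(p)
    X = np.sqrt(mu / n) * y[:, None] * u + rng.normal(size=(n, p)) / np.sqrt(p)
    # Edge probabilities: c_in / n within a community, c_out / n across;
    # requires |lam| <= sqrt(d) so both probabilities stay nonnegative.
    c_in, c_out = d + lam * np.sqrt(d), d - lam * np.sqrt(d)
    P = np.where(np.equal.outer(y, y), c_in / n, c_out / n)
    upper = np.triu(rng.random((n, n)) < P, k=1)         # sample upper triangle
    A = (upper | upper.T).astype(float)                  # symmetric, no self-loops
    return X, A, y
```

Sweeping lam from positive to negative values produces a controlled spectrum from strongly homophilic to strongly heterophilic benchmarks, which is what makes this model convenient for stress-testing universality claims.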