Graph neural networks (GNNs) have demonstrated superior performance for semi-supervised node classification on graphs, owing to their ability to exploit node features and topological information simultaneously. However, most GNNs implicitly assume that the labels of nodes and their neighbors in a graph are the same or consistent, which does not hold in heterophilic graphs, where the labels of linked nodes are likely to differ. Hence, when the topology is non-informative for label prediction, ordinary GNNs may perform significantly worse than simply applying multi-layer perceptrons (MLPs) to each node. To tackle this problem, we propose a new $p$-Laplacian based GNN model, termed $^p$GNN, whose message passing mechanism is derived from a discrete regularization framework and can be theoretically interpreted as an approximation of a polynomial graph filter defined on the spectral domain of $p$-Laplacians. Spectral analysis shows that the new message passing mechanism works simultaneously as a low-pass and a high-pass filter, making $^p$GNNs effective on both homophilic and heterophilic graphs. Empirical studies on real-world and synthetic datasets validate our findings and demonstrate that $^p$GNNs significantly outperform several state-of-the-art GNN architectures on heterophilic benchmarks while achieving competitive performance on homophilic benchmarks. Moreover, $^p$GNNs can adaptively learn aggregation weights and are robust to noisy edges.
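For concreteness, a minimal sketch of a discrete $p$-Laplacian regularization objective of the kind referenced above, written in a standard form from the graph regularization literature (the notation $X$, $F$, $w_{uv}$, $d_v$, $\mu$ is assumed here and the exact objective and normalization used by $^p$GNN may differ):

$$\min_{F}\;\; \mathcal{S}_p(F) + \mu \sum_{v \in \mathcal{V}} \|F_v - X_v\|_2^2, \qquad \mathcal{S}_p(F) = \frac{1}{2} \sum_{(u,v) \in \mathcal{E}} w_{uv} \left\| \frac{F_v}{\sqrt{d_v}} - \frac{F_u}{\sqrt{d_u}} \right\|_2^p,$$

where $X$ denotes the input node features, $F$ the learned node representations, $w_{uv}$ the edge weights, $d_v$ the node degrees, and $\mu > 0$ a fidelity trade-off. Under this form, $p = 2$ reduces the smoothness term to the ordinary normalized graph Laplacian quadratic form, whose iterative minimization gives a low-pass message passing scheme, whereas $p < 2$ penalizes large cross-edge discrepancies less severely, preserving high-frequency components; this is the intuition behind the simultaneous low-pass/high-pass behavior claimed above.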