Graph neural networks (GNNs) and the label propagation algorithm (LPA) are both message-passing algorithms and have achieved superior performance in semi-supervised classification. A GNN propagates features through a neural network to make predictions, while LPA propagates labels across the graph adjacency matrix to obtain results. However, there is still no effective way to combine these two kinds of algorithms. In this paper, we propose a new Unified Message Passing model (UniMP) that incorporates feature propagation and label propagation within a shared message-passing network, providing better performance in semi-supervised classification. First, we adopt a Graph Transformer network together with label embedding to propagate both feature and label information. Second, to train UniMP without overfitting to self-loop label information, we propose a masked label prediction strategy, in which some percentage of the training labels are masked at random and then predicted. UniMP conceptually unifies feature propagation and label propagation and is empirically powerful: it obtains new state-of-the-art semi-supervised classification results on the Open Graph Benchmark (OGB). Our implementation is available online at https://github.com/PaddlePaddle/PGL/tree/main/ogb_examples/nodeproppred/unimp.