Graph neural networks (GNNs) with missing node features have recently received increasing attention. Missing node features severely degrade the performance of existing GNNs. Several recent methods reconstruct the missing features by propagating information between nodes with known and unknown attributes. Although these methods achieve strong performance, exactly how to exploit the complex data correlations among nodes to reconstruct missing node features remains a great challenge. To address this problem, we propose self-supervised guided hypergraph feature propagation (SGHFP). Specifically, a feature hypergraph is first generated from the node features with missing entries. The reconstructed node features produced by the previous iteration are then fed to a two-layer GNN to construct a pseudo-label hypergraph. Before each iteration, the feature hypergraph and the pseudo-label hypergraph are fused, which better preserves the higher-order data correlations among nodes. The fused hypergraph is then applied to feature propagation to reconstruct the missing features. Finally, the node features reconstructed by multi-iteration optimization are used for the downstream semi-supervised classification task. Extensive experiments demonstrate that the proposed SGHFP outperforms existing methods for semi-supervised classification with missing node features.
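To make the pipeline concrete, below is a minimal, illustrative Python/NumPy sketch of one SGHFP-style iteration. The kNN hyperedge construction, the concatenation-based fusion, and all function names are assumptions for illustration rather than the paper's exact procedure; the pseudo-label classifier (a two-layer GNN in the paper) is abstracted to a placeholder.

```python
import numpy as np

def knn_feature_hyperedges(X, k=3):
    # One hyperedge per node, grouping it with its k nearest neighbours in
    # feature space -- a common feature-hypergraph construction, used here as
    # a hypothetical stand-in for the paper's exact procedure.
    n = X.shape[0]
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    H = np.zeros((n, n))                       # rows: nodes, columns: hyperedges
    for e in range(n):
        H[np.argsort(dist[e])[:k + 1], e] = 1.0   # node e plus its k neighbours
    return H

def pseudo_label_hyperedges(pseudo_labels, n_classes):
    # One hyperedge per class: nodes sharing a pseudo-label (produced by the
    # two-layer GNN in the paper, abstracted away here) are joined together.
    n = len(pseudo_labels)
    H = np.zeros((n, n_classes))
    H[np.arange(n), pseudo_labels] = 1.0
    return H

def hypergraph_feature_propagation(H, X, mask, iters=40):
    # Diffuse features through the fused hypergraph and clamp observed entries
    # after every step, analogous to feature propagation on ordinary graphs.
    Dv, De = H.sum(1), H.sum(0)                # node and hyperedge degrees
    P = (H / De) @ H.T / Dv[:, None]           # row-stochastic propagation matrix
    Xr = np.where(mask, X, 0.0)                # unknown entries start at zero
    for _ in range(iters):
        Xr = P @ Xr
        Xr[mask] = X[mask]                     # reset known features each step
    return Xr

# One illustrative SGHFP-style iteration on toy data:
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))
mask = rng.random(X.shape) > 0.5               # True where a feature is observed
pseudo = rng.integers(0, 2, size=8)            # stand-in for GNN pseudo-labels
Hf = knn_feature_hyperedges(np.where(mask, X, 0.0))
Hl = pseudo_label_hyperedges(pseudo, n_classes=2)
H = np.concatenate([Hf, Hl], axis=1)           # naive fusion: stack hyperedges
X_recon = hypergraph_feature_propagation(H, X, mask)
```

In a full run, the reconstructed features `X_recon` would be fed back to the GNN to refresh the pseudo-labels, and the hypergraph construction, fusion, and propagation steps would repeat for several iterations before the final features are passed to the downstream classifier.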