Recent years have witnessed remarkable success achieved by graph neural networks (GNNs) in many real-world applications such as recommendation and drug discovery. Despite this success, oversmoothing has been identified as one of the key issues limiting the performance of deep GNNs: as aggregation layers are stacked, the learned node representations become highly indistinguishable. In this paper, we propose a new perspective on the performance degradation of deep GNNs, i.e., feature overcorrelation. Through an empirical and theoretical study of this matter, we demonstrate the existence of feature overcorrelation in deeper GNNs and reveal potential reasons leading to this issue. To reduce the feature correlation, we propose a general framework, DeCorr, which encourages GNNs to encode less redundant information. Extensive experiments demonstrate that DeCorr helps enable deeper GNNs and is complementary to existing techniques tackling the oversmoothing issue.
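To make the notion of feature overcorrelation concrete, one simple way to quantify it (a minimal sketch, not necessarily the exact metric used in the paper) is the average absolute Pearson correlation between pairs of feature dimensions of the learned node representations; values near 1 indicate that the dimensions encode largely redundant information:

```python
import numpy as np

def feature_correlation(X):
    """Average absolute Pearson correlation between feature dimensions.

    X: (num_nodes, d) matrix of node representations. A value near 1
    means the d feature dimensions are highly redundant (overcorrelated).
    """
    # np.corrcoef treats rows as variables, so transpose to correlate columns
    corr = np.corrcoef(X.T)
    d = corr.shape[0]
    # Average the absolute off-diagonal entries (self-correlations are always 1)
    off_diag = corr[~np.eye(d, dtype=bool)]
    return np.abs(off_diag).mean()

# Two perfectly correlated dimensions yield the maximum score of 1.0
X = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])
print(round(float(feature_correlation(X)), 4))  # 1.0
```

Tracking this score across layers of a deep GNN would reveal whether representations grow more redundant with depth, which is the phenomenon DeCorr aims to mitigate.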