Graph Neural Networks (GNNs) have attracted much attention due to their ability to learn representations from graph-structured data. Despite the successful applications of GNNs in many domains, their optimization is less well studied, and their node classification performance suffers heavily from the long-tailed node-degree distribution. This paper focuses on improving the performance of GNNs via normalization. Specifically, by studying the long-tailed distribution of node degrees in the graph, we propose a novel normalization method for GNNs, termed ResNorm (\textbf{Res}haping the long-tailed distribution into a normal-like distribution via \textbf{norm}alization). The $scale$ operation of ResNorm reshapes the node-wise standard deviation (NStd) distribution so as to improve the accuracy of tail nodes (\textit{i}.\textit{e}., low-degree nodes). We provide a theoretical interpretation and empirical evidence for the mechanism behind this $scale$ operation. Beyond the long-tailed distribution issue, over-smoothing is another fundamental problem plaguing the community. We therefore analyze the behavior of the standard shift and prove that it acts as a preconditioner on the weight matrix, increasing the risk of over-smoothing. With the over-smoothing issue in mind, we design a $shift$ operation for ResNorm that simulates the degree-specific parameter strategy at low cost. Extensive experiments validate the effectiveness of ResNorm on several node classification benchmark datasets.
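To make the two operations concrete, below is a minimal PyTorch-style sketch of a ResNorm-like layer. The abstract only states that the $scale$ step reshapes the NStd distribution and that the $shift$ step cheaply simulates degree-specific parameters; the power transform $\sigma \mapsto \sigma^{\alpha}$ and the degree-dependent gate used here are illustrative assumptions, not the paper's exact formulas, and the class and parameter names (\texttt{ResNormSketch}, \texttt{alpha}) are hypothetical.
\begin{verbatim}
import torch
import torch.nn as nn

class ResNormSketch(nn.Module):
    """Hypothetical sketch of a ResNorm-style normalization layer.

    Assumptions (not from the paper): the scale step maps each node's
    std sigma to sigma**alpha with alpha in (0, 1), compressing the
    long tail of the NStd distribution; the shift step recenters
    features with a degree-dependent gate as a cheap stand-in for
    per-degree parameters.
    """

    def __init__(self, alpha: float = 0.5, eps: float = 1e-6):
        super().__init__()
        self.alpha = alpha  # exponent shrinking the spread of NStd
        self.eps = eps      # numerical stability constant

    def forward(self, x: torch.Tensor, deg: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, num_features]; deg: [num_nodes] node degrees
        mean = x.mean(dim=1, keepdim=True)            # node-wise mean
        std = x.std(dim=1, keepdim=True) + self.eps   # node-wise std (NStd)

        # Scale: renormalize each node so its new std is std**alpha,
        # reshaping the NStd distribution toward a normal-like one.
        x_scaled = (x - mean) / std * std.pow(self.alpha) + mean

        # Shift: degree-dependent recentering; low-degree (tail) nodes
        # get a larger correction than high-degree (head) nodes.
        gate = 1.0 / (1.0 + deg.float().clamp(min=1).log()).unsqueeze(1)
        return x_scaled - gate * mean

# Toy usage: 5 nodes with 16-dimensional features
x = torch.randn(5, 16)
deg = torch.tensor([1, 2, 2, 10, 50])
out = ResNormSketch(alpha=0.5)(x, deg)
\end{verbatim}
The key design point this sketch illustrates is that both operations are node-wise and parameter-free apart from $\alpha$, so the layer adds negligible cost compared to learning a separate parameter set per degree.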