Graph Neural Networks (GNNs) have achieved great success in various graph mining tasks. However, drastic performance degradation is consistently observed when a GNN is stacked with many layers. As a result, most GNNs only adopt shallow architectures, which limits their expressive power and their exploitation of deep neighborhoods. Most recent studies attribute the performance degradation of deep GNNs to the \textit{over-smoothing} issue. In this paper, we disentangle the conventional graph convolution operation into two independent operations: \textit{Propagation} (\textbf{P}) and \textit{Transformation} (\textbf{T}). Following this, the depth of a GNN can be split into the propagation depth ($D_p$) and the transformation depth ($D_t$). Through extensive experiments, we find that the major cause of the performance degradation of deep GNNs is the \textit{model degradation} issue caused by large $D_t$, rather than the \textit{over-smoothing} issue mainly caused by large $D_p$. Further, we present \textit{Adaptive Initial Residual} (AIR), a plug-and-play module compatible with all kinds of GNN architectures, to alleviate the \textit{model degradation} issue and the \textit{over-smoothing} issue simultaneously. Experimental results on six real-world datasets demonstrate that GNNs equipped with AIR outperform most GNNs with shallow architectures owing to the benefits of both large $D_p$ and large $D_t$, while the additional time cost introduced by AIR is negligible.
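To make the abstract's two ideas concrete, the following is a minimal sketch (not the authors' implementation): it decouples graph convolution into a parameter-free \textit{Propagation} step repeated $D_p$ times and a stack of $D_t$ \textit{Transformation} layers, and it adds an initial-residual gate in the spirit of AIR. The specific gating form (a learned, node-wise sigmoid weight mixing each layer's output with the initial representation) and the names \texttt{AIRLayer}, \texttt{DecoupledGNN}, \texttt{d\_p}, and \texttt{d\_t} are illustrative assumptions, not definitions from the paper.
\begin{verbatim}
# Sketch only: the AIR gating form below is an assumption for illustration.
import torch
import torch.nn as nn


class AIRLayer(nn.Module):
    """One transformation (T) step with a node-adaptive initial-residual gate."""

    def __init__(self, dim: int):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.gate = nn.Linear(2 * dim, 1)  # scores how much of the initial h0 to keep

    def forward(self, h: torch.Tensor, h0: torch.Tensor) -> torch.Tensor:
        z = torch.relu(self.lin(h))
        alpha = torch.sigmoid(self.gate(torch.cat([z, h0], dim=-1)))  # [N, 1]
        return alpha * h0 + (1.0 - alpha) * z


class DecoupledGNN(nn.Module):
    """Propagation depth D_p and transformation depth D_t set independently."""

    def __init__(self, dim: int, d_p: int, d_t: int):
        super().__init__()
        self.d_p = d_p
        self.layers = nn.ModuleList(AIRLayer(dim) for _ in range(d_t))

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # P: parameter-free feature propagation repeated D_p times.
        h = x
        for _ in range(self.d_p):
            h = adj_norm @ h
        # T: D_t transformation layers, each anchored to the propagated features.
        h0 = h
        for layer in self.layers:
            h = layer(h, h0)
        return h


if __name__ == "__main__":
    n, dim = 5, 8
    adj = torch.rand(n, n)
    adj_norm = adj / adj.sum(dim=1, keepdim=True)  # crude row normalization
    model = DecoupledGNN(dim, d_p=10, d_t=4)
    print(model(torch.randn(n, dim), adj_norm).shape)  # torch.Size([5, 8])
\end{verbatim}
Note that a large $D_p$ (here 10 propagation steps) adds no parameters, while each extra transformation layer does; the initial-residual gate is what keeps the deep transformation stack anchored to the un-degraded features.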