The performance of GNNs degrades as they become deeper due to over-smoothing. Among the many attempts to prevent over-smoothing, the residual connection is one of the most promising thanks to its simplicity. However, recent studies have shown that GNNs with residual connections only slightly slow the degeneration, and the reason residual connections fail in GNNs remains unknown. In this paper, we investigate the forward and backward behavior of GNNs with residual connections from a novel path-decomposition perspective. We find that the recursive aggregation of median-length paths, which dominate the binomial distribution of residual-connection paths, governs the output representation and leads to over-smoothing as GNNs grow deeper. Moreover, the entangled propagation and weight matrices cause gradient smoothing and prevent GNNs with residual connections from being optimized toward the identity mapping. Based on these findings, we present a Universal Deep GNNs (UDGNN) framework with cold-start adaptive residual connections (DRIVE) and feedforward modules. Extensive experiments demonstrate the effectiveness of our method, which achieves state-of-the-art results on non-smooth heterophily datasets by simply stacking standard GNNs.
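The binomial-distribution claim can be made concrete with a minimal counting sketch (my own illustration, not code from the paper): in an L-layer GNN with residual connections, each layer either takes the identity branch or applies propagation, so the output decomposes into paths of propagation length k counted by C(L, k). The function name `path_counts` and the choice L = 16 are illustrative assumptions.

```python
from math import comb

def path_counts(L):
    # Number of residual-connection paths that apply propagation
    # exactly k times out of L layers: C(L, k).
    return [comb(L, k) for k in range(L + 1)]

L = 16
counts = path_counts(L)
total = sum(counts)  # 2**L paths in total

# The median-length paths (k near L/2) carry the largest share of
# the mixture, so their heavily aggregated (over-smoothed)
# representations dominate the output as L grows.
dominant_k = max(range(L + 1), key=lambda k: counts[k])
print(dominant_k)             # L/2 = 8
print(counts[dominant_k] / total)
```

Short paths, which would preserve node-level information, and the length-0 identity path each contribute only a C(L, k)/2^L fraction that vanishes relative to the median-length mass as L increases.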