A Survey on Oversmoothing in Graph Neural Networks

Node features of graph neural networks (GNNs) tend to become increasingly similar as the network depth grows. This effect is known as over-smoothing, which we axiomatically define as the exponential convergence of suitable similarity measures on the node features. Our definition unifies previous approaches and gives rise to new quantitative measures of over-smoothing. Moreover, we empirically demonstrate this behavior for several over-smoothing measures on graphs of different scales (small, medium, and large). We also review several approaches for mitigating over-smoothing and empirically test their effectiveness on real-world graph datasets. Through illustrative examples, we demonstrate that mitigating over-smoothing is a necessary but not sufficient condition for building deep GNNs that are expressive on a wide range of graph learning tasks. Finally, we extend our definition of over-smoothing to the rapidly emerging field of continuous-time GNNs.
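To make the axiomatic definition concrete, here is a minimal sketch of the kind of condition it describes. The notation is illustrative and not taken from the text above: $X^{(n)}$ denotes the node features after layer $n$, $\mu$ is an assumed node-similarity measure (the Dirichlet energy, in a simple unnormalized form, is one common choice in this literature), and $C_1, C_2$ are illustrative constants.

```latex
% Sketch: over-smoothing as exponential convergence of a similarity measure.
% \mu is a node-similarity measure vanishing exactly when all node features
% coincide; one common (unnormalized) choice is the Dirichlet energy over
% the edge set E:
%   \mu(X) = \sum_{(i,j) \in E} \lVert x_i - x_j \rVert^2 .
% Over-smoothing with respect to \mu then means: there exist constants
% C_1, C_2 > 0 such that
\[
  \mu\bigl(X^{(n)}\bigr) \;\le\; C_1\, e^{-C_2 n}
  \qquad \text{for all layers } n \ge 0 .
\]
```

Under such a condition, $\mu(X^{(n)}) \to 0$ exponentially fast, i.e., the node features collapse toward a common value as depth increases.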