The over-smoothing problem is an obstacle to developing deep graph neural networks (GNNs). Although many approaches for mitigating over-smoothing have been proposed, a comprehensive understanding of the problem is still lacking. In this work, we analyze the over-smoothing problem from the Markov chain perspective. Focusing on the message passing of GNNs, we first establish a connection between GNNs and Markov chains on the graph. GNNs are divided into two classes, operator-consistent and operator-inconsistent, according to whether the corresponding Markov chains are time-homogeneous. We then attribute the over-smoothing problem to the convergence of an arbitrary initial distribution to a stationary distribution. On this basis, we prove that although previously proposed methods can alleviate over-smoothing, they cannot avoid it. In addition, we draw conclusions about the over-smoothing problem for the two classes of GNNs in the Markovian sense. On the one hand, operator-consistent GNNs cannot avoid over-smoothing, which occurs at an exponential rate. On the other hand, operator-inconsistent GNNs do not always over-smooth. Further, we investigate the existence of the limiting distribution of the time-inhomogeneous Markov chain, from which we derive a sufficient condition for operator-inconsistent GNNs to avoid over-smoothing. Finally, we design experiments to verify our findings. The results show that the proposed sufficient condition effectively relieves the over-smoothing problem in operator-inconsistent GNNs and improves model performance.
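The convergence phenomenon behind over-smoothing can be seen in a toy numerical sketch (an illustration of the Markov-chain intuition only, not the paper's construction): repeatedly applying a row-stochastic propagation operator, the transition matrix of a random walk on the graph, drives any initial vector toward the walk's stationary distribution, so the per-node values become indistinguishable. The small graph and the choice of random-walk normalization here are assumptions made for the example.

```python
import numpy as np

# A small connected graph (4-cycle) with self-loops added.
A = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 1]], dtype=float)

# Row-normalize: P is the transition matrix of a random walk on the graph,
# a time-homogeneous ("operator-consistent") propagation operator.
P = A / A.sum(axis=1, keepdims=True)

# Arbitrary initial distribution over the nodes.
x = np.array([1.0, 0.0, 0.0, 0.0])

# 50 rounds of propagation: x_{k+1} = x_k P.
for _ in range(50):
    x = x @ P

# x has converged to the stationary distribution pi (here uniform, since
# the graph is regular): further propagation no longer changes it.
residual = np.linalg.norm(x @ P - x)
print(x)         # ~[0.25, 0.25, 0.25, 0.25]
print(residual)  # ~0
```

Because the second-largest eigenvalue modulus of `P` is strictly below 1 on this connected, aperiodic graph, the residual decays geometrically with the number of propagation steps, matching the exponential-rate behavior stated above.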