Over-squashing is a phenomenon of inefficient information propagation between distant nodes in a network. It is an important problem known to significantly impair the training of graph neural networks (GNNs), since the receptive field of a node grows exponentially with network depth. To mitigate this problem, a preprocessing step known as rewiring is often applied to the input network. In this paper, we investigate the use of discrete analogues of classical geometric notions of curvature to model information flow on networks and to rewire them. We show that these classical notions achieve state-of-the-art GNN training accuracy on a variety of real-world network datasets. Moreover, compared to the current state of the art, these classical notions run several orders of magnitude faster.
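As a concrete illustration of the discrete curvature notions the abstract refers to, one classical choice is the (simplified, triangle-free) Forman-Ricci curvature of an edge $(u, v)$ in an unweighted graph, $F(u,v) = 4 - \deg(u) - \deg(v)$: negatively curved edges act as bottlenecks between dense regions. The sketch below is a hedged illustration only; the function names and the toy rewiring rule (adding one supporting edge around the most negatively curved edge) are our assumptions, not the paper's exact procedure.

```python
from collections import defaultdict

def forman_curvature(edges):
    """Simplified Forman-Ricci curvature F(u,v) = 4 - deg(u) - deg(v)
    for an unweighted, undirected graph given as a list of edges."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return {(u, v): 4 - deg[u] - deg[v] for (u, v) in edges}

def rewire_most_negative(edges):
    """Toy curvature-guided rewiring step (illustrative, not the paper's
    rule): find the most negatively curved edge, i.e. the bottleneck,
    and add one edge bridging a neighbor of u to a neighbor of v."""
    curv = forman_curvature(edges)
    u, v = min(curv, key=curv.get)  # bottleneck edge
    nbrs = defaultdict(set)
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    for x in nbrs[u] - {v}:
        for y in nbrs[v] - {u}:
            if x != y and y not in nbrs[x]:
                return edges + [(x, y)]  # add one supporting edge
    return edges  # nothing useful to add

# Two triangles joined by a bridge edge (2, 3): the bridge has
# curvature 4 - 3 - 3 = -2, the most negative in the graph.
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
```

Because such curvatures depend only on local degree (and, in richer variants, triangle and square counts), they can be computed in a single pass over the edges, which is the source of the runtime advantage the abstract claims over optimization-based rewiring.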