The quality of signal propagation in message-passing graph neural networks (GNNs) strongly influences their expressivity, as recent works have observed. In particular, for prediction tasks that rely on long-range interactions, recursive aggregation of node features can lead to an undesirable phenomenon called "oversquashing". We present a framework for analyzing oversquashing based on information contraction. Our analysis is guided by a model of reliable computation due to von Neumann, which yields a new view of oversquashing as signal quenching in noisy computation graphs. Building on this, we propose a graph rewiring algorithm aimed at alleviating oversquashing. Our algorithm employs a random local edge flip primitive motivated by an expander graph construction. We compare the spectral expansion properties of our algorithm with those of an existing curvature-based non-local rewiring strategy. Synthetic experiments show that, while our algorithm generally expands more slowly, it is computationally cheaper, preserves node degrees exactly, and never disconnects the graph.
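The local edge flip primitive can be sketched as follows. This is a plausible degree-preserving variant, not necessarily the exact rule used in the paper: pick a random edge (u, v), pick a neighbour u1 of u and a neighbour v1 of v, and swap the endpoints of the edges (u, u1) and (v, v1). The pivot edge (u, v) is left intact, so any path that used a removed edge can be rerouted through it, which is why connectivity is never lost; degrees are unchanged because each vertex keeps the same number of incident edges.

```python
import random
from collections import deque

def random_local_edge_flip(adj, rng=random):
    """Attempt one degree-preserving local edge flip on an undirected
    graph given as {node: set(neighbours)}.  Returns True if a flip
    was applied.  (A sketch; the paper's exact primitive may differ.)"""
    edges = [(u, v) for u in adj for v in adj[u] if u < v]
    u, v = rng.choice(edges)
    u_cands = [x for x in adj[u] if x != v]
    v_cands = [x for x in adj[v] if x != u]
    if not u_cands or not v_cands:
        return False
    u1, v1 = rng.choice(u_cands), rng.choice(v_cands)
    # Skip flips that would collapse vertices or create parallel edges.
    if len({u, v, u1, v1}) < 4 or v1 in adj[u] or u1 in adj[v]:
        return False
    # Replace (u, u1) and (v, v1) with (u, v1) and (v, u1); the pivot
    # edge (u, v) stays, so the graph remains connected.
    adj[u].remove(u1); adj[u1].remove(u)
    adj[v].remove(v1); adj[v1].remove(v)
    adj[u].add(v1); adj[v1].add(u)
    adj[v].add(u1); adj[u1].add(v)
    return True

def is_connected(adj):
    """BFS reachability check from an arbitrary start vertex."""
    start = next(iter(adj))
    seen, queue = {start}, deque([start])
    while queue:
        x = queue.popleft()
        for y in adj[x] - seen:
            seen.add(y)
            queue.append(y)
    return len(seen) == len(adj)

# Demo on a 12-cycle: repeated flips never change degrees or disconnect.
n = 12
adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
degrees_before = {i: len(adj[i]) for i in adj}
rng = random.Random(0)
for _ in range(200):
    random_local_edge_flip(adj, rng)
assert {i: len(adj[i]) for i in adj} == degrees_before
assert is_connected(adj)
```

Because each flip is a purely local O(1) rewrite once an edge is sampled, many flips can be applied cheaply, which matches the abstract's claim of lower computational cost relative to non-local curvature-based rewiring.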