Convolutional layers within graph neural networks operate by aggregating information about local neighbourhood structures; one common way to encode such substructures is through random walks. The distribution of these random walks evolves according to a diffusion equation defined using the graph Laplacian. We extend this approach by leveraging classic mathematical results about hypo-elliptic diffusions. This results in a novel tensor-valued graph operator, which we call the hypo-elliptic graph Laplacian. We provide theoretical guarantees and efficient low-rank approximation algorithms. In particular, this gives a structured approach to capturing long-range dependencies on graphs that is robust to pooling. Besides its attractive theoretical properties, our experiments show that this method competes with graph transformers on datasets requiring long-range reasoning, but scales only linearly in the number of edges as opposed to quadratically in the number of nodes.
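To make the diffusion picture above concrete, the following is a minimal sketch (my own illustration in Python with numpy and scipy.sparse; the function name random_walk_diffusion and the toy path graph are assumptions, not code from the paper) of the classical random-walk diffusion that the abstract says is being extended: the walk distribution evolves under the transition kernel P = D^{-1}A, the discrete-time counterpart of a diffusion equation driven by the random-walk Laplacian L = I - D^{-1}A, and each step is a single sparse matrix-vector product, so the cost grows linearly in the number of edges. The tensor-valued hypo-elliptic graph Laplacian introduced in the paper refines this operator and is not reproduced here.

import numpy as np
import scipy.sparse as sp

def random_walk_diffusion(adj: sp.csr_matrix, p0: np.ndarray, steps: int) -> np.ndarray:
    """Evolve a distribution over nodes under the random-walk transition kernel.

    adj   : sparse (n, n) adjacency matrix of an undirected graph
    p0    : initial distribution over the n nodes
    steps : number of diffusion steps

    One step is p <- P^T p with P = D^{-1} A, the discrete-time analogue of the
    diffusion equation d/dt p_t = -L^T p_t with random-walk Laplacian L = I - D^{-1} A.
    Each step costs O(|E|), since it is a single sparse matrix-vector product.
    """
    deg = np.asarray(adj.sum(axis=1)).ravel()
    deg[deg == 0] = 1.0                      # avoid division by zero on isolated nodes
    P = sp.diags(1.0 / deg) @ adj            # row-stochastic transition matrix D^{-1} A
    p = p0.astype(float)
    for _ in range(steps):
        p = P.T @ p                          # push the walk distribution one step forward
    return p

# Usage: a path graph on 4 nodes, with all mass initially on node 0.
A = sp.csr_matrix(np.array([[0, 1, 0, 0],
                            [1, 0, 1, 0],
                            [0, 1, 0, 1],
                            [0, 0, 1, 0]], dtype=float))
p0 = np.array([1.0, 0.0, 0.0, 0.0])
print(random_walk_diffusion(A, p0, steps=3))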