Despite the recent success of graph neural networks (GNN), common architectures often exhibit significant limitations, including sensitivity to oversmoothing, difficulty capturing long-range dependencies, and vulnerability to spurious edges, e.g., as can arise from graph heterophily or adversarial attacks. To at least partially address these issues within a simple transparent framework, we consider a new family of GNN layers designed to mimic and integrate the update rules of two classical iterative algorithms, namely, proximal gradient descent and iterative reweighted least squares (IRLS). The former defines an extensible base GNN architecture that is immune to oversmoothing while nonetheless capturing long-range dependencies by allowing arbitrary propagation steps. In contrast, the latter produces a novel attention mechanism that is explicitly anchored to an underlying end-to-end energy function, contributing stability with respect to edge uncertainty. When combined, we obtain an extremely simple yet robust model that we evaluate across disparate scenarios, including standardized benchmarks, adversarially perturbed graphs, graphs with heterophily, and graphs involving long-range dependencies. In doing so, we compare against SOTA GNN approaches that have been explicitly designed for each respective task, achieving competitive or superior node classification accuracy. Our code is available at https://github.com/FFTYYY/TWIRLS.
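To make the two ingredients concrete, below is a minimal dense-matrix sketch of the general idea: propagation implemented as gradient descent on a graph-regularized energy, with IRLS-style edge reweighting playing the role of attention. The function name `twirls_propagate`, the smoothed absolute-value penalty, and the hyperparameter defaults are illustrative assumptions chosen for exposition, not the exact formulation from the paper or its repository.

```python
import numpy as np

def twirls_propagate(X, A, num_steps=16, lam=1.0, alpha=0.5, eps=1e-6):
    """Sketch: energy-based propagation with IRLS-style edge reweighting.

    Runs gradient descent on the hypothetical energy
        E(Y) = ||Y - X||_F^2 + lam * sum_{(i,j) in E} rho(||y_i - y_j||),
    where rho is a robust penalty. Each IRLS step replaces rho with a
    quadratic upper bound, i.e. per-edge weights w_ij that act like
    attention by downweighting edges joining dissimilar nodes.
    """
    Y = X.astype(float)           # node features, shape (n, d)
    for _ in range(num_steps):
        # IRLS reweighting: with rho(d) = 2*sqrt(d^2 + eps) (a smoothed
        # absolute value), the surrogate weight is w_ij = 1 / ||y_i - y_j||,
        # masked by the adjacency so non-edges stay at zero.
        diff = Y[:, None, :] - Y[None, :, :]
        dist = np.sqrt((diff ** 2).sum(-1) + eps)
        W = A / dist
        L = np.diag(W.sum(axis=1)) - W   # reweighted graph Laplacian
        # Gradient step on the quadratic surrogate: the fidelity term
        # (Y - X) anchors features to the input, preventing collapse to
        # a constant (oversmoothed) solution even for large num_steps.
        grad = (Y - X) + lam * (L @ Y)
        Y = Y - alpha * grad
    return Y
```

In this sketch, increasing `num_steps` drives `Y` toward a fixed point of the surrogate energy rather than toward a constant vector, which is the sense in which an energy-anchored architecture can take many propagation steps (capturing long-range dependencies) without oversmoothing; the `1/dist` weights illustrate how an IRLS-derived attention can suppress spurious edges.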