While many existing graph neural networks (GNNs) have been proven to perform $\ell_2$-based graph smoothing that enforces smoothness globally, in this work we aim to further enhance the local smoothness adaptivity of GNNs via $\ell_1$-based graph smoothing. To this end, we introduce a family of GNNs (Elastic GNNs) based on $\ell_1$ and $\ell_2$-based graph smoothing. In particular, we propose a novel and general message passing scheme for GNNs. This message passing algorithm is not only friendly to back-propagation training but also achieves the desired smoothing properties with a theoretical convergence guarantee. Experiments on semi-supervised learning tasks demonstrate that the proposed Elastic GNNs obtain better adaptivity on benchmark datasets and are significantly more robust to graph adversarial attacks. The implementation of Elastic GNNs is available at \url{https://github.com/lxiaorui/ElasticGNN}.
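To make the distinction between the two smoothing notions concrete, the display below sketches an elastic graph smoothing objective of the kind alluded to above; the symbols $\lambda_1$, $\lambda_2$, the incidence matrix $\Delta$, and the exact form of each term are illustrative assumptions rather than the precise objective derived in the paper:
\[
\min_{\mathbf{F}} \; \frac{1}{2}\,\|\mathbf{F} - \mathbf{X}\|_F^2 \;+\; \lambda_1 \,\|\Delta \mathbf{F}\|_1 \;+\; \frac{\lambda_2}{2}\,\mathrm{tr}\!\left(\mathbf{F}^{\top} \mathbf{L}\,\mathbf{F}\right),
\]
where $\mathbf{X}$ denotes the input node features, $\mathbf{L}$ a graph Laplacian, and $\Delta$ a graph incidence (difference) operator. Under this reading, the quadratic trace term is the $\ell_2$-based smoother that enforces smoothness globally, while the $\ell_1$ term penalizes feature differences across edges sparsely and thus allows locally adaptive, piecewise-smooth signals.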