An enhanced label propagation (LP) method called GraphHop has recently been proposed. It outperforms graph convolutional networks (GCNs) in the semi-supervised node classification task on various networks. Although the performance of GraphHop was explained intuitively through the joint smoothing of node attributes and labels, a rigorous mathematical treatment has been lacking. In this paper, we provide new insights into GraphHop by analyzing it from a constrained optimization viewpoint. We show that GraphHop offers an alternating optimization solution to a regularization problem defined on graphs. Based on this interpretation, we propose two ideas to improve GraphHop further, leading to GraphHop++. We conduct extensive experiments to demonstrate the effectiveness and efficiency of GraphHop++. We observe that GraphHop++ consistently outperforms all other benchmarking methods, including GraphHop, on five test datasets as well as on an object recognition task at extremely low label rates (i.e., 1, 2, 4, 8, 16, and 20 labeled samples per class).
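As a rough illustration of the kind of graph regularization problem referred to above (the notation here is generic and is not necessarily the exact objective analyzed in the paper), a classical label propagation formulation seeks soft label estimates $F$ that minimize
$$\operatorname{tr}\!\left(F^{\top}\tilde{L}F\right) \;+\; \mu\,\lVert F - Y\rVert_F^2,$$
where $Y$ holds the known one-hot labels (with zero rows for unlabeled nodes), $\tilde{L}=I-D^{-1/2}AD^{-1/2}$ is the normalized graph Laplacian of the graph with adjacency matrix $A$ and degree matrix $D$, and $\mu>0$ trades off label smoothness over graph edges against fidelity to the given labels. Solving such objectives by iteratively updating one block of variables at a time is the sort of alternating optimization to which the constrained optimization viewpoint refers.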