The Graph Convolutional Network (GCN) is a pioneering model for graph-based semi-supervised learning. However, GCN does not perform well on sparsely-labeled graphs. Its two-layer version cannot effectively propagate label information across the whole graph (i.e., the under-smoothing problem), while its deep version over-smooths and is hard to train (i.e., the over-smoothing problem). To solve these two issues, we propose a new graph neural network called GND-Nets (for Graph Neural Diffusion Networks) that exploits the local and global neighborhood information of a vertex in a single layer. Using a shallow network mitigates the over-smoothing problem, while exploiting local and global neighborhood information mitigates the under-smoothing problem. The local and global neighborhood information of a vertex is captured by a new graph diffusion method called neural diffusions, which integrate neural networks into conventional linear and nonlinear graph diffusions. The adoption of neural networks makes neural diffusions adaptable to different datasets. Extensive experiments on various sparsely-labeled graphs verify the effectiveness and efficiency of GND-Nets compared to state-of-the-art approaches.
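To make the idea concrete, below is a minimal sketch of a single-layer network that mixes several graph diffusion steps with a learned, neural-network-produced weighting, assuming a PyTorch setting. The class name GNDLayer, the attention-style MLP over diffusion steps, and the default number of steps are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn as nn


class GNDLayer(nn.Module):
    """Illustrative single layer mixing local and global neighborhood information.

    Computes the diffusion steps X, A_hat X, ..., A_hat^K X and combines them
    with per-vertex weights produced by a small MLP, so the mixture is learned
    from data rather than fixed as in purely linear diffusions.
    """

    def __init__(self, in_dim: int, out_dim: int, num_steps: int = 10, hidden: int = 16):
        super().__init__()
        self.num_steps = num_steps
        # Hypothetical choice: an MLP scores each diffusion step per vertex.
        self.step_mlp = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )
        self.classifier = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, a_hat: torch.Tensor) -> torch.Tensor:
        # a_hat: symmetrically normalized adjacency with self-loops, as in GCN.
        steps = [x]
        for _ in range(self.num_steps):
            steps.append(a_hat @ steps[-1])             # next diffusion step
        stacked = torch.stack(steps, dim=1)             # (N, K+1, in_dim)

        # Neural weighting of the K+1 steps, one softmax per vertex.
        scores = self.step_mlp(stacked).squeeze(-1)     # (N, K+1)
        weights = torch.softmax(scores, dim=-1)
        mixed = torch.einsum("nk,nkd->nd", weights, stacked)
        return self.classifier(mixed)


if __name__ == "__main__":
    # Tiny usage example on a random graph with 5 vertices and 8 features.
    n, d, c = 5, 8, 3
    adj = (torch.rand(n, n) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float() + torch.eye(n)  # symmetrize, add self-loops
    deg_inv_sqrt = adj.sum(dim=1).pow(-0.5)
    a_hat = deg_inv_sqrt.unsqueeze(1) * adj * deg_inv_sqrt.unsqueeze(0)

    layer = GNDLayer(in_dim=d, out_dim=c, num_steps=4)
    logits = layer(torch.randn(n, d), a_hat)
    print(logits.shape)  # torch.Size([5, 3])
```

Because the weighting network sees all diffusion steps of a vertex, near and far neighborhoods can contribute differently per vertex and per dataset, which is one way to read the claim that neural diffusions adapt to different datasets.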