Graph neural networks (GNNs) are important tools for transductive learning tasks, such as node classification in graphs, owing to their expressive power in capturing complex interdependencies among nodes. To enable graph neural network learning, existing works typically assume that labeled nodes from two or more classes are provided, so that a discriminative classifier can be learned from the labeled data. In reality, this assumption may be too restrictive, as users often provide labels for only a small number of nodes in a single class of interest. In addition, most GNN models aggregate information only from short distances (e.g., 1-hop neighbors) in each round and fail to capture long-distance relationships in graphs. In this paper, we propose a novel graph neural network framework, long-short distance aggregation networks (LSDAN), to overcome these limitations. By generating multiple graphs at different distance levels from the adjacency matrix, we develop a long-short distance attention model over these graphs: direct neighbors are captured by a short-distance attention mechanism, while distant neighbors are captured by a long-distance attention mechanism. Two novel risk estimators are further employed to aggregate the long-short-distance networks for positive-unlabeled (PU) learning, and the resulting loss is back-propagated for model training. Experimental results on real-world datasets demonstrate the effectiveness of our algorithm.
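The two building blocks named in the abstract can be illustrated with a short sketch. The snippet below is a minimal illustration, not the paper's implementation: it assumes dense NumPy adjacency matrices, `distance_level_graphs` builds the distance-level graphs as binarized powers of the adjacency matrix, and `nn_pu_risk` is a standard non-negative PU risk in the style of Kiryo et al. (2017); both helper names and the sigmoid surrogate loss are assumptions made for this example, since the abstract does not specify the exact estimators.

```python
import numpy as np

def distance_level_graphs(adj, max_hops=3):
    """Build one graph per distance level from the adjacency matrix.

    adj: dense 0/1 adjacency matrix of shape (N, N).
    Returns [A_1, ..., A_K], where A_k links nodes connected by a walk
    of length k (self-loops removed), so long-distance neighbors become
    direct neighbors in the higher-level graphs.
    """
    adj = (np.asarray(adj) > 0).astype(float)
    walk = np.eye(adj.shape[0])
    graphs = []
    for _ in range(max_hops):
        walk = walk @ adj                    # counts of length-k walks
        a_k = (walk > 0).astype(float)       # keep only connectivity
        np.fill_diagonal(a_k, 0.0)
        graphs.append(a_k)
    return graphs

def nn_pu_risk(scores_pos, scores_unl, prior):
    """Non-negative PU risk with the sigmoid surrogate loss.

    scores_pos / scores_unl: classifier scores for labeled-positive and
    unlabeled nodes; prior: class prior (pi) of the positive class.
    """
    loss = lambda z: 1.0 / (1.0 + np.exp(z))   # sigmoid loss for label +1
    risk_pos = loss(scores_pos).mean()          # positives labeled positive
    risk_pos_neg = loss(-scores_pos).mean()     # positives labeled negative
    risk_unl_neg = loss(-scores_unl).mean()     # unlabeled labeled negative
    neg_part = risk_unl_neg - prior * risk_pos_neg
    return prior * risk_pos + max(0.0, neg_part)
```

In the full model, each distance-level graph would feed its own short- or long-distance attention head, and the PU risk estimated over the aggregated node representations would be back-propagated to train the network, as the abstract describes.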