Graph data is omnipresent and has a wide variety of applications, such as in the natural sciences, social networks, or the semantic web. Though rich in information, graphs are often noisy and incomplete. Therefore, graph completion tasks such as node classification and link prediction have gained attention. On the one hand, neural methods such as graph neural networks have proven to be robust tools for learning rich representations of noisy graphs. On the other hand, symbolic methods enable exact reasoning on graphs. We propose KeGNN, a neuro-symbolic framework for learning on graph data that combines both paradigms and allows prior knowledge to be integrated into a graph neural network model. In essence, KeGNN consists of a graph neural network as a base on which knowledge enhancement layers are stacked with the objective of refining predictions with respect to prior knowledge. We instantiate KeGNN in conjunction with two standard graph neural networks, Graph Convolutional Networks and Graph Attention Networks, and evaluate KeGNN on multiple benchmark datasets for node classification.
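To make the architecture concrete, the sketch below illustrates the general idea of stacking knowledge enhancement layers on a GNN base. It is a minimal, hypothetical illustration assuming a PyTorch Geometric-style GCN backbone; the class names (`KnowledgeEnhancementLayer`, `KeGNNSketch`) and the simple neighbour-agreement rule used as "knowledge" are placeholders, not the paper's actual formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch_geometric.nn import GCNConv  # assumes torch_geometric is installed


class KnowledgeEnhancementLayer(nn.Module):
    """Hypothetical layer refining class predictions with prior knowledge.

    Here the 'knowledge' is a learnable clause weight that nudges a node's
    predictions toward agreement with its neighbours, standing in for rules
    such as Cite(x, y) -> (Class(x) <-> Class(y)).
    """

    def __init__(self, num_classes: int):
        super().__init__()
        self.clause_weight = nn.Parameter(torch.zeros(num_classes))

    def forward(self, logits: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        src, dst = edge_index
        # Aggregate the predictions of each node's neighbours.
        neighbour_sum = torch.zeros_like(logits)
        neighbour_sum.index_add_(0, dst, logits[src])
        # Refine the base predictions in proportion to the clause weight.
        return logits + F.softplus(self.clause_weight) * neighbour_sum


class KeGNNSketch(nn.Module):
    """GCN base with stacked knowledge enhancement layers (illustrative only)."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int,
                 num_ke_layers: int = 2):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)
        self.ke_layers = nn.ModuleList(
            KnowledgeEnhancementLayer(num_classes) for _ in range(num_ke_layers)
        )

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # Base GNN produces initial class predictions.
        h = F.relu(self.conv1(x, edge_index))
        logits = self.conv2(h, edge_index)
        # Knowledge enhancement layers refine them with respect to the rules.
        for ke in self.ke_layers:
            logits = ke(logits, edge_index)
        return logits
```

In this reading, the base network stays a standard GCN (a GAT backbone could be swapped in analogously), and the enhancement layers only adjust its output predictions, which is why the prior knowledge can be added on top of an existing model rather than baked into it.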