Rule mining is an effective approach for reasoning over knowledge graphs (KGs). Existing works mainly concentrate on mining rules, but several rules may be applicable to reasoning over the same relation, and how to select appropriate rules for completing different triples has not been discussed. In this paper, we propose to take context information into consideration, which helps select suitable rules for inference tasks. Based on this idea, we propose a transformer-based rule mining approach, Ruleformer. It consists of two blocks: 1) an encoder that extracts context information from the subgraph of the head entity with a modified attention mechanism, and 2) a decoder that aggregates the subgraph information from the encoder output and generates the probability of relations at each reasoning step. The basic idea behind Ruleformer is to regard the rule mining process as a sequence-to-sequence task. To feed the subgraph to the encoder as a sequence while retaining the graph structure, we devise a relational attention mechanism in the Transformer. The experiment results show the necessity of considering context information in the rule mining task and the effectiveness of our model.
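The abstract does not specify the exact form of the relational attention mechanism; as a rough illustration only, a common way to retain graph structure in attention is to shift each key by an embedding of the relation on the corresponding edge. The sketch below (all names hypothetical, single head, numpy) follows that generic pattern, not necessarily Ruleformer's actual formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def relational_attention(H, rel_ids, R, Wq, Wk, Wv):
    """Single-head attention over subgraph nodes where the key for the
    pair (i, j) is shifted by the embedding of the relation on edge i->j.
    rel_ids[i, j] indexes a row of R; index 0 stands for "no edge"."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    d = Q.shape[-1]
    # Edge-conditioned keys: K_ij = K_j + R[rel_ids[i, j]]
    K_edge = K[None, :, :] + R[rel_ids]            # (n, n, d)
    scores = np.einsum('id,ijd->ij', Q, K_edge) / np.sqrt(d)
    A = softmax(scores, axis=-1)                   # (n, n) attention weights
    return A @ V, A

rng = np.random.default_rng(0)
n, d = 4, 8
H = rng.normal(size=(n, d))                 # node (entity) states
rel_ids = rng.integers(0, 3, size=(n, n))   # relation id per ordered pair
R = rng.normal(size=(3, d)) * 0.1           # relation embeddings
R[0] = 0.0                                  # id 0 = no edge, no shift
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, A = relational_attention(H, rel_ids, R, Wq, Wk, Wv)
```

Because the relation embedding enters the attention score itself, two subgraphs with the same entities but different connecting relations yield different attention patterns, which is what lets a sequence encoder see the graph structure.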