Large-scale knowledge graph embedding has attracted much attention from both academia and industry in the field of Artificial Intelligence. However, most existing methods concentrate solely on the fact triples contained in the given knowledge graph. Inspired by the observation that logic rules provide a flexible and declarative language for expressing rich background knowledge, it is natural to integrate logic rules into knowledge graph embedding, so as to transfer human knowledge into entity and relation embeddings and strengthen the learning process. In this paper, we propose a novel rule-enhanced method that can be easily integrated with any translation-based knowledge graph embedding model, such as TransE. We first introduce a method to automatically mine logic rules, together with their confidences, from the triples. Then, to place triples and mined rules in the same semantic space, all triples in the knowledge graph are represented as first-order logic formulas. Finally, we define several operations on these formulas and minimize a global loss over both the mined logic rules and the transformed triples. We conduct extensive experiments on link prediction and triple classification over three datasets: WN18, FB166, and FB15K. The experiments show that the rule-enhanced method significantly improves the performance of several baselines. A highlight of our model is that filtered Hits@1, a pivotal metric for knowledge inference, improves substantially (by up to 700%).
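To make the translation-based setting concrete, the sketch below shows the standard TransE plausibility score, where a valid triple (h, r, t) should satisfy h + r ≈ t in the embedding space. This is a minimal illustration of the base model the method builds on, not the paper's full training procedure; the dimension and random data are assumptions for the example.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE plausibility score of triple (h, r, t): ||h + r - t||.
    Lower is better -- a valid triple should satisfy h + r ≈ t."""
    return np.linalg.norm(h + r - t, ord=norm)

rng = np.random.default_rng(0)
dim = 50
h = rng.normal(size=dim)
r = rng.normal(size=dim)
t = h + r + rng.normal(scale=0.01, size=dim)  # near-valid tail embedding
corrupt = rng.normal(size=dim)                # randomly corrupted tail

# The true triple scores lower (better) than the corrupted one.
assert transe_score(h, r, t) < transe_score(h, r, corrupt)
```

In training, such scores are typically plugged into a margin-based ranking loss over observed triples and corrupted negatives.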
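One common way to define "operations on first-order logic" over embedding scores, used for illustration here (the paper's exact operators may differ), is product t-norm fuzzy logic: triple scores are mapped into [0, 1] truth values, and conjunction and implication are defined algebraically so a grounded rule's truth is differentiable. The 1/(3·sqrt(d)) normalization below is one conventional scaling, assumed for this sketch.

```python
import numpy as np

def truth(h, r, t, d):
    """Map an L1 translation-based score into a [0, 1] truth value.
    Higher means more plausible; the scaling constant is an assumption."""
    return float(np.clip(1.0 - np.linalg.norm(h + r - t, ord=1) / (3 * np.sqrt(d)), 0.0, 1.0))

def t_and(a, b):
    """Product t-norm conjunction: I(f1 AND f2) = I(f1) * I(f2)."""
    return a * b

def t_implies(a, b):
    """Implication under the product-logic convention I(f1 => f2) = a*b - a + 1:
    a false premise (a=0) makes the rule vacuously true (value 1)."""
    return a * b - a + 1.0

# A perfectly satisfied triple has truth 1.0.
d = 4
assert truth(np.zeros(d), np.ones(d), np.ones(d), d) == 1.0
```

A grounded rule such as (x, bornIn, y) AND (y, cityOf, z) => (x, nationality, z) then gets the truth value `t_implies(t_and(truth(...), truth(...)), truth(...))`, and the global loss can penalize rules and triples whose truth falls below a margin.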