This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a Hierarchical Transformer model to jointly learn Entity-relation composition and Relational contextualization based on a source entity's neighborhood. Our proposed model consists of two different Transformer blocks: the bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity and the top block aggregates the relational information from outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context and the source entity itself. Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets. We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
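For concreteness, the following is a minimal PyTorch sketch of the two-level design described above: a bottom Transformer block that composes each (entity, relation) pair from the source entity's neighborhood, and a top Transformer block that aggregates those pair representations together with the source entity for masked entity prediction. The module names, the [CLS]-style pooling, and all dimensions are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a hierarchical Transformer for KG link prediction,
# loosely following the bottom/top block structure described in the abstract.
# All hyperparameters and pooling choices here are assumptions.
import torch
import torch.nn as nn


class HierarchicalKGTransformer(nn.Module):
    def __init__(self, num_entities, num_relations, dim=128, heads=4, layers=2):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        self.relation_emb = nn.Embedding(num_relations, dim)
        # Learned [CLS]-style vectors used for pooling at each level (assumption).
        self.pair_cls = nn.Parameter(torch.randn(1, 1, dim))
        self.ctx_cls = nn.Parameter(torch.randn(1, 1, dim))

        def make_encoder():
            layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
            return nn.TransformerEncoder(layer, num_layers=layers)

        self.bottom = make_encoder()  # entity-relation composition
        self.top = make_encoder()     # relational contextualization over the neighborhood
        self.out = nn.Linear(dim, num_entities)

    def forward(self, src_entity, neighbor_entities, neighbor_relations):
        # src_entity: (B,), neighbor_entities / neighbor_relations: (B, N)
        B, N = neighbor_entities.shape
        e = self.entity_emb(neighbor_entities).view(B * N, 1, -1)
        r = self.relation_emb(neighbor_relations).view(B * N, 1, -1)
        cls = self.pair_cls.expand(B * N, 1, -1)
        # Bottom block: one Transformer pass per (entity, relation) pair in the neighborhood.
        pair_repr = self.bottom(torch.cat([cls, e, r], dim=1))[:, 0].view(B, N, -1)
        # Top block: aggregate the source entity with its neighborhood pair features.
        src = self.entity_emb(src_entity).unsqueeze(1)
        ctx = torch.cat([self.ctx_cls.expand(B, 1, -1), src, pair_repr], dim=1)
        pooled = self.top(ctx)[:, 0]
        # Score all entities for the (masked) target-entity prediction task.
        return self.out(pooled)
```

In this sketch, training would mask the target entity of a query triple and optimize a cross-entropy loss over the returned entity scores; balancing how much the model relies on the source entity versus its relational context (the paper's masked entity prediction task) could then be implemented by randomly masking the source entity's own embedding in the top block's input.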