Predicting missing links between entities in a knowledge graph is a fundamental task for dealing with the incompleteness of data on the Web. Knowledge graph embeddings map nodes into a vector space to predict new links, scoring them according to geometric criteria. Relations in the graph may follow patterns that can be learned, e.g., some relations might be symmetric and others might be hierarchical. However, the learning capability of different embedding models varies for each pattern and, so far, no single model can learn all patterns equally well. In this paper, we combine the query representations from several models into a unified representation that incorporates the patterns captured independently by each model. Our combination uses attention to select the most suitable model to answer each query. The models are also mapped onto a non-Euclidean manifold, the Poincar\'e ball, to capture structural patterns, such as hierarchies, in addition to relational patterns, such as symmetry. We prove that our combination provides higher expressiveness and inference power than each model on its own. As a result, the combined model can learn both relational and structural patterns. Extensive experiments on various link prediction benchmarks show that the combined model outperforms the individual models, including state-of-the-art approaches.
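The attention-based combination described above can be sketched as follows; the notation is illustrative only, as the abstract does not give the exact formulation. Assuming $M$ base models produce query representations $\mathbf{q}_1, \dots, \mathbf{q}_M$ and $\mathbf{a}$ is a hypothetical learned attention vector, a standard softmax attention yields

\[
\alpha_i = \frac{\exp(\mathbf{a}^\top \mathbf{q}_i)}{\sum_{j=1}^{M} \exp(\mathbf{a}^\top \mathbf{q}_j)},
\qquad
\mathbf{q} = \sum_{i=1}^{M} \alpha_i \, \mathbf{q}_i .
\]

Candidate answers can then be scored by their distance to $\mathbf{q}$ on the Poincar\'e ball, whose metric is

\[
d(\mathbf{x}, \mathbf{y}) = \operatorname{arcosh}\!\left(1 + 2\,\frac{\lVert \mathbf{x} - \mathbf{y} \rVert^2}{(1 - \lVert \mathbf{x} \rVert^2)(1 - \lVert \mathbf{y} \rVert^2)}\right),
\]

so that closer entities receive higher scores. Note that a strictly hyperbolic formulation would replace the Euclidean weighted sum with a hyperbolic aggregation (e.g., a Möbius or midpoint operation); the Euclidean sum above is a simplifying assumption for exposition.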