Knowledge graph (KG) embeddings learn low-dimensional representations of entities and relations to predict missing facts. KGs often exhibit hierarchical and logical patterns which must be preserved in the embedding space. For hierarchical data, hyperbolic embedding methods have shown promise for high-fidelity and parsimonious representations. However, existing hyperbolic embedding methods do not account for the rich logical patterns in KGs. In this work, we introduce a class of hyperbolic KG embedding models that simultaneously capture hierarchical and logical patterns. Our approach combines hyperbolic reflections and rotations with attention to model complex relational patterns. Experimental results on standard KG benchmarks show that our method improves over previous Euclidean- and hyperbolic-based efforts by up to 6.1% in mean reciprocal rank (MRR) in low dimensions. Furthermore, we observe that different geometric transformations capture different types of relations while attention-based transformations generalize to multiple relations. In high dimensions, our approach yields new state-of-the-art MRRs of 49.6% on WN18RR and 57.7% on YAGO3-10.
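The transformations the abstract alludes to can be illustrated with a minimal toy sketch: block-diagonal Givens rotations and reflections applied to an entity embedding, with softmax attention choosing between the two candidate transforms. This is a simplified Euclidean illustration only; the actual model operates in hyperbolic space with learned curvatures, and all function names here are illustrative, not the paper's API.

```python
import numpy as np

def givens_rotation(theta, x):
    """Apply a block-diagonal 2x2 (Givens) rotation to x.

    theta: array of shape (d/2,), one angle per 2D block.
    x: embedding of even dimension d. Rotations are orthogonal,
    so the norm of x is preserved.
    """
    c, s = np.cos(theta), np.sin(theta)
    x = x.reshape(-1, 2)
    out = np.stack([c * x[:, 0] - s * x[:, 1],
                    s * x[:, 0] + c * x[:, 1]], axis=-1)
    return out.reshape(-1)

def givens_reflection(phi, x):
    """Apply a block-diagonal 2x2 reflection (determinant -1 per block).

    Reflections are also orthogonal, hence norm-preserving, but unlike
    rotations they can model symmetric relational patterns.
    """
    c, s = np.cos(phi), np.sin(phi)
    x = x.reshape(-1, 2)
    out = np.stack([c * x[:, 0] + s * x[:, 1],
                    s * x[:, 0] - c * x[:, 1]], axis=-1)
    return out.reshape(-1)

def attention_combine(query, cand_rot, cand_ref):
    """Softmax attention over the rotated and reflected candidates,
    returning their convex combination (a hypothetical stand-in for
    the paper's attention mechanism)."""
    scores = np.array([query @ cand_rot, query @ cand_ref])
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w[0] * cand_rot + w[1] * cand_ref
```

Because each 2x2 block is orthogonal, both transforms preserve vector norms; the attention weights then let a single relation interpolate between rotation-like and reflection-like behavior instead of committing to one geometry.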