Representation learning for knowledge graphs (KGs) has focused on the problem of answering simple link prediction queries. In this work we address the more ambitious challenge of predicting the answers of conjunctive queries with multiple missing entities. We propose Bi-Directional Query Embedding (BIQE), a method that embeds conjunctive queries with models based on bi-directional attention mechanisms. In contrast to prior work, bidirectional self-attention can capture interactions among all the elements of a query graph. We introduce a new dataset for predicting the answers of conjunctive queries and conduct experiments that show BIQE significantly outperforming state-of-the-art baselines.
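As a rough illustration of the bidirectional attention idea (a minimal sketch, not the paper's actual architecture), the snippet below applies unmasked self-attention to a set of query-graph token embeddings, so that every element of the query — entities, relations, and placeholders for missing answers — can attend to every other element in both directions. The tokenization and all names here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_self_attention(X):
    """Unmasked (bidirectional) self-attention: every token attends
    to every other token, capturing all pairwise interactions."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # pairwise interaction scores
    return softmax(scores, axis=-1) @ X  # weighted mix of all tokens

# Hypothetical tokenization of a conjunctive query such as
# "?y : worksFor(Alice, ?x) AND locatedIn(?x, ?y)" — one row per element:
# [Alice, worksFor, ?x, locatedIn, ?y]
rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 16))
out = bidirectional_self_attention(tokens)
print(out.shape)  # each element now encodes interactions with all others
```

Because the attention is unmasked, information flows both "forward" and "backward" along the query graph, which is the property the abstract contrasts with prior sequential approaches.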