Knowledge graphs (KGs) are key components of many natural language processing applications. To further expand their coverage, previous studies on knowledge graph completion usually require a large number of training instances for each relation. However, we observe that long-tail relations are in fact common in KGs, and newly added relations often lack enough known triples for training. In this work, we aim to predict new facts under a challenging setting where only one training instance is available. We propose a one-shot relational learning framework that exploits the knowledge captured by embedding models and learns a matching metric over both the learned embeddings and one-hop graph structures. Empirically, our model yields considerable performance improvements over existing embedding models and also eliminates the need to re-train the embedding models when dealing with newly added relations.
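To illustrate the matching-metric idea in one-shot relation completion, here is a minimal sketch. It assumes a simple mean-pooling one-hop neighbor encoder and cosine similarity as the metric; the actual framework learns these components, and all function names here are hypothetical.

```python
import numpy as np

def encode_entity(entity_emb, neighbor_embs):
    """Hypothetical one-hop encoder: blend an entity's embedding with the
    average of its one-hop neighbors' embeddings (mean pooling)."""
    if len(neighbor_embs) == 0:
        return entity_emb
    return 0.5 * (entity_emb + np.mean(neighbor_embs, axis=0))

def pair_repr(head_emb, tail_emb):
    """Represent an (head, tail) entity pair as one vector."""
    return np.concatenate([head_emb, tail_emb])

def match_score(reference_pair, candidate_pair):
    """Cosine similarity stands in for the learned matching metric."""
    num = reference_pair @ candidate_pair
    den = np.linalg.norm(reference_pair) * np.linalg.norm(candidate_pair)
    return float(num / den)
```

Given the single known triple of a new relation, one would encode its (head, tail) pair as the reference and rank candidate pairs for a query head by `match_score`, without re-training the embedding model.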