Knowledge graphs (KGs) serve as useful resources for various natural language processing applications. Previous KG completion approaches require a large number of training instances (i.e., head-tail entity pairs) for every relation. In reality, however, only a handful of entity pairs are available for most relations. Existing work on one-shot learning limits method generalizability to few-shot scenarios and does not fully exploit the supervisory information; meanwhile, few-shot KG completion has not been well studied. In this work, we propose a novel few-shot relation learning model (FSRL) that aims to discover facts of new relations given only a few reference entity pairs. FSRL effectively captures knowledge from the heterogeneous graph structure, aggregates the representations of the few-shot references, and matches candidate entity pairs against the reference set of each relation. Extensive experiments on two public datasets demonstrate that FSRL outperforms the state of the art.
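To make the reference-aggregation and matching idea concrete, below is a minimal sketch, not the paper's implementation: it assumes a simple linear entity-pair encoder, mean-pooling over the few-shot references, and cosine-similarity matching, whereas FSRL itself uses a heterogeneous neighbor encoder, an attention-based reference aggregator, and a recurrent matching network. All class and parameter names are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FewShotRelationMatcher(nn.Module):
    """Simplified few-shot relation matcher (illustrative only).

    Assumptions (not from the paper): a (head, tail) pair is encoded by a
    linear layer over concatenated embeddings, the K reference pairs are
    mean-pooled into one relation prototype, and a candidate pair is scored
    by cosine similarity against that prototype.
    """

    def __init__(self, emb_dim: int, hidden_dim: int):
        super().__init__()
        self.pair_encoder = nn.Linear(2 * emb_dim, hidden_dim)

    def encode_pair(self, head: torch.Tensor, tail: torch.Tensor) -> torch.Tensor:
        # Encode a (head, tail) entity pair into a single vector.
        return torch.tanh(self.pair_encoder(torch.cat([head, tail], dim=-1)))

    def forward(self, ref_heads, ref_tails, query_head, query_tail):
        # ref_heads/ref_tails: (K, emb_dim) few-shot reference pairs of one relation
        # query_head/query_tail: (emb_dim,) candidate pair to score
        ref_repr = self.encode_pair(ref_heads, ref_tails).mean(dim=0)  # aggregate references
        query_repr = self.encode_pair(query_head, query_tail)
        # Higher score -> candidate pair is more likely a fact of this relation.
        return F.cosine_similarity(ref_repr, query_repr, dim=-1)


if __name__ == "__main__":
    emb_dim, hidden_dim, K = 16, 32, 3  # K = few-shot reference size
    model = FewShotRelationMatcher(emb_dim, hidden_dim)
    ref_h, ref_t = torch.randn(K, emb_dim), torch.randn(K, emb_dim)
    q_h, q_t = torch.randn(emb_dim), torch.randn(emb_dim)
    print(model(ref_h, ref_t, q_h, q_t).item())  # similarity score for the query pair
```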