Knowledge graph completion aims to extend a knowledge graph (KG) with missing triples. In this paper, we propose GenKGC, an approach that converts knowledge graph completion into a sequence-to-sequence generation task with a pre-trained language model. We further introduce relation-guided demonstration and entity-aware hierarchical decoding for better representation learning and faster inference. Experimental results on three datasets show that our approach obtains better or comparable performance relative to baselines and achieves faster inference than previous methods based on pre-trained language models. We also release AliopenKG500, a new large-scale Chinese knowledge graph dataset, for research purposes. Code and datasets are available at https://github.com/zjunlp/PromptKG/tree/main/GenKGC.
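As a minimal sketch of the sequence-to-sequence formulation, the example below linearizes a (head, relation, ?) query into text and decodes candidate tail entities with a BART-style pre-trained model via HuggingFace Transformers. The linearization format, separator, and decoding parameters are illustrative assumptions, not the paper's exact implementation; GenKGC additionally prepends relation-guided demonstrations to the input and constrains generation with entity-aware hierarchical decoding.

```python
# Sketch: knowledge graph completion as seq2seq generation.
# Assumptions (not from the paper): "facebook/bart-base" backbone,
# a " | "-separated input format, and plain beam search.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Flatten a (head, relation, ?) query into a text sequence.
head, relation = "Steve Jobs", "founder of"
source = f"{head} | {relation} |"

inputs = tokenizer(source, return_tensors="pt")

# Beam search yields ranked tail-entity candidates; GenKGC's
# entity-aware hierarchical decoding restricts this step to valid
# entity names, which is what speeds up inference.
outputs = model.generate(
    **inputs,
    num_beams=5,
    num_return_sequences=5,
    max_length=32,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```

In this formulation, ranking metrics such as Hits@N can be computed directly from the ordered beam outputs, without scoring every entity in the KG as embedding-based methods do.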