We present a novel method for relation extraction (RE) from a single sentence, mapping the sentence and two given entities to a canonical fact in a knowledge graph (KG). Especially in this presumed sentential RE setting, the context of a single sentence is often sparse. This paper introduces the KGPool method to address this sparsity, dynamically expanding the context with additional facts from the KG. It learns the representation of these facts (entity alias, entity descriptions, etc.) using neural methods, supplementing the sentential context. Unlike existing methods that statically use all expanded facts, KGPool conditions this expansion on the sentence. We study the efficacy of KGPool by evaluating it with different neural models and KGs (Wikidata and NYT Freebase). Our experimental evaluation on standard datasets shows that by feeding the KGPool representation into a Graph Neural Network, the overall method is significantly more accurate than state-of-the-art methods.