With the advance of natural language inference (NLI), there is a rising demand for NLI models that can handle scientific texts. Existing methods depend on pre-trained models (PTMs) that lack domain-specific knowledge. To address this drawback, we introduce a scientific knowledge graph to generalize PTMs to the scientific domain. However, existing knowledge graph construction approaches suffer from several drawbacks: expensive labeled data, failure to transfer to other domains, long inference time, and difficulty scaling to large corpora. We therefore propose an unsupervised knowledge graph construction method that builds a scientific knowledge graph (SKG) without any labeled data. Moreover, to alleviate the effect of noise in the SKG and better complement the knowledge in sentences, we propose an event-centric knowledge infusion method that integrates external knowledge into each event, where an event is a fine-grained semantic unit within a sentence. Experimental results show that our method achieves state-of-the-art performance and demonstrate the effectiveness and reliability of the SKG.