Compared to the general news domain, information extraction (IE) from biomedical text requires much broader domain knowledge. However, many previous IE methods do not utilize any external knowledge during inference. Due to the exponential growth of biomedical publications, models that do not go beyond their fixed set of parameters will likely fall behind. Inspired by how humans look up relevant information to comprehend a scientific text, we present a novel framework that utilizes external knowledge for joint entity and relation extraction named KECI (Knowledge-Enhanced Collective Inference). Given an input text, KECI first constructs an initial span graph representing its initial understanding of the text. It then uses an entity linker to form a knowledge graph containing relevant background knowledge for the entity mentions in the text. To make the final predictions, KECI fuses the initial span graph and the knowledge graph into a more refined graph using an attention mechanism. KECI takes a collective approach to link mention spans to entities by integrating global relational information into local representations using graph convolutional networks. Our experimental results show that the framework is highly effective, achieving new state-of-the-art results on two benchmark datasets: BioRelEx (binding interaction detection) and ADE (adverse drug event extraction). For example, KECI achieves absolute improvements of 4.59% and 4.91% in F1 scores over the state-of-the-art on the BioRelEx entity and relation extraction tasks.
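To make the two modeling steps described above more concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' released code) of (1) a graph-convolution layer that propagates global relational information into local node representations, and (2) an attention step that fuses a mention-span representation with its candidate knowledge-graph entity embeddings. All module names, tensor shapes, and the simplified mean-aggregation GCN variant are illustrative assumptions.

```python
# Hypothetical sketch of the two ideas named in the abstract; shapes and
# module names are assumptions, not KECI's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph-convolution layer using mean aggregation over neighbors."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, node_feats, adj):
        # node_feats: (num_nodes, dim); adj: (num_nodes, num_nodes),
        # assumed to already include self-loops.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        neighbor_avg = (adj @ node_feats) / deg   # aggregate neighbor features
        return F.relu(self.linear(neighbor_avg))


class KnowledgeFusion(nn.Module):
    """Attend from each mention span to its candidate KG entities and fuse."""
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.out = nn.Linear(2 * dim, dim)

    def forward(self, span_vecs, kg_vecs):
        # span_vecs: (num_spans, dim); kg_vecs: (num_spans, num_cands, dim)
        q = self.query(span_vecs).unsqueeze(1)           # (S, 1, d)
        k = self.key(kg_vecs)                            # (S, C, d)
        scores = (q * k).sum(-1) / k.size(-1) ** 0.5     # scaled dot-product
        attn = scores.softmax(dim=-1).unsqueeze(-1)      # (S, C, 1)
        kg_summary = (attn * kg_vecs).sum(dim=1)         # (S, d)
        # Concatenate the span view and its knowledge summary, then project.
        return self.out(torch.cat([span_vecs, kg_summary], dim=-1))
```

In a full pipeline these components would sit between span encoding and the final entity and relation classifiers; the sketch omits entity linking, candidate retrieval, and prediction heads.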