The performance of relation extraction models has increased considerably with the rise of neural networks. However, a key issue of neural relation extraction is robustness: the models do not scale well to long sentences with multiple entities and relations. In this work, we address this problem with an enriched attention mechanism. Attention allows the model to focus on parts of the input sentence that are relevant to relation extraction. We propose to enrich the attention function with features modeling knowledge about the relation arguments and the shortest dependency path between them. Thus, for different relation arguments, the model can pay attention to different parts of the sentence. Our model outperforms prior work using comparable setups on two popular benchmarks, and our analysis confirms that it indeed scales to long sentences with many entities.
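To make the idea of an enriched attention function concrete, below is a minimal sketch of additive attention whose scoring function also sees per-token enrichment features (e.g., positions relative to the relation arguments and an on-shortest-dependency-path flag). All names, shapes, and the specific scoring form are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def enriched_attention(H, F, W_h, W_f, v):
    """Additive attention enriched with per-token features.

    H: (n, d_h) token representations
    F: (n, d_f) enrichment features (hypothetical: argument-relative
       positions, shortest-dependency-path indicator, ...)
    Returns attention weights over tokens and the weighted context vector.
    """
    scores = np.tanh(H @ W_h.T + F @ W_f.T) @ v   # (n,) one score per token
    alpha = softmax(scores)                        # attention distribution
    context = alpha @ H                            # (d_h,) sentence summary
    return alpha, context

# Toy usage with random parameters (illustrative only).
rng = np.random.default_rng(0)
n, d_h, d_f, d_a = 6, 8, 4, 5
H = rng.normal(size=(n, d_h))
F = rng.normal(size=(n, d_f))
W_h = rng.normal(size=(d_a, d_h))
W_f = rng.normal(size=(d_a, d_f))
v = rng.normal(size=d_a)
alpha, context = enriched_attention(H, F, W_h, W_f, v)
```

Because the features F change with the choice of relation arguments, the same sentence yields different attention distributions for different argument pairs, which is the behavior the abstract describes.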