Named entity recognition (NER) and entity linking (EL) are two fundamentally related tasks, since in order to perform EL, the mentions of entities first have to be detected. However, most entity linking approaches disregard the mention detection step, assuming that the correct mentions have already been detected. In this paper, we perform joint learning of NER and EL to leverage their relatedness and obtain a more robust and generalisable system. To that end, we introduce a model inspired by the Stack-LSTM approach (Dyer et al., 2015). We observe that multi-task learning of NER and EL does, in fact, improve performance on both tasks compared with models trained on each objective individually. Furthermore, we achieve results competitive with the state of the art in both NER and EL.
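To make the joint-training idea concrete, the sketch below shows one common way to share an encoder between an NER head and an EL head and sum their losses. This is an illustrative multi-task setup, not the paper's Stack-LSTM architecture; all dimensions, head designs, the candidate-entity set size, and the equal loss weighting are assumptions.

```python
# Hedged sketch: a shared BiLSTM encoder with separate NER and EL heads,
# trained with a single joint (summed) loss. Not the authors' model.
import torch
import torch.nn as nn

class JointNerEl(nn.Module):
    def __init__(self, vocab_size=5000, emb_dim=64, hidden=128,
                 num_ner_tags=9, num_candidate_entities=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.ner_head = nn.Linear(2 * hidden, num_ner_tags)           # per-token BIO tag scores
        self.el_head = nn.Linear(2 * hidden, num_candidate_entities)  # per-token entity candidate scores

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))
        return self.ner_head(states), self.el_head(states)

model = JointNerEl()
loss_fn = nn.CrossEntropyLoss()
tokens = torch.randint(0, 5000, (2, 12))    # dummy batch: 2 sentences of 12 tokens
ner_gold = torch.randint(0, 9, (2, 12))     # dummy gold NER tags
el_gold = torch.randint(0, 50, (2, 12))     # dummy gold entity candidates

ner_logits, el_logits = model(tokens)
# Multi-task objective: gradients from both tasks flow into the shared encoder.
# Equal weighting of the two losses is an arbitrary illustrative choice.
loss = loss_fn(ner_logits.reshape(-1, 9), ner_gold.reshape(-1)) + \
       loss_fn(el_logits.reshape(-1, 50), el_gold.reshape(-1))
loss.backward()
```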