Current state-of-the-art semantic role labeling (SRL) uses a deep neural network with no explicit linguistic features. However, prior work has shown that gold syntax trees can dramatically improve SRL decoding, suggesting the possibility of increased accuracy from explicit modeling of syntax. In this work, we present linguistically-informed self-attention (LISA): a neural network model that combines multi-head self-attention with multi-task learning across dependency parsing, part-of-speech tagging, predicate detection and SRL. Unlike previous models, which require significant pre-processing to prepare linguistic features, LISA can incorporate syntax using only raw tokens as input, encoding the sequence just once to simultaneously perform parsing, predicate detection and role labeling for all predicates. Syntax is incorporated by training one attention head to attend to syntactic parents for each token. Moreover, if a high-quality syntactic parse is already available, it can be beneficially injected at test time without re-training our SRL model. In experiments on CoNLL-2005 SRL, LISA achieves new state-of-the-art performance for a model using predicted predicates and standard word embeddings, attaining F1 2.5 points higher than the previous state-of-the-art on newswire and more than 3.5 F1 points higher on out-of-domain data, nearly a 10% reduction in error. On CoNLL-2012 English SRL we also show an improvement of more than 2.5 F1. LISA also outperforms the state-of-the-art with contextually-encoded (ELMo) word representations, by nearly 1.0 F1 on news and more than 2.0 F1 on out-of-domain text.
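The core mechanism described above, supervising one attention head to attend to each token's syntactic parent and optionally swapping in an external parse at test time, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' released implementation: the module name SyntaxAttentionHead, the dimension and argument names, and the exact loss formulation are illustrative.

```python
# Minimal sketch (assumed names, not the released LISA code) of a
# syntactically-informed self-attention head: one head's attention
# scores are trained to point at each token's syntactic parent, and
# at test time a one-hot distribution from an external parse can be
# injected in its place.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntaxAttentionHead(nn.Module):
    def __init__(self, d_model: int, d_head: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_head)
        self.k = nn.Linear(d_model, d_head)
        self.v = nn.Linear(d_model, d_head)
        self.scale = d_head ** -0.5

    def forward(self, x, gold_heads=None, injected_heads=None):
        # x: [batch, seq, d_model]
        scores = torch.matmul(self.q(x), self.k(x).transpose(-2, -1)) * self.scale
        attn = F.softmax(scores, dim=-1)  # [batch, seq, seq]

        # Auxiliary parsing loss: each row of `scores` is treated as a
        # classifier over candidate syntactic parents for that token.
        parse_loss = None
        if gold_heads is not None:  # gold_heads: [batch, seq] parent indices
            parse_loss = F.cross_entropy(
                scores.reshape(-1, scores.size(-1)),  # [batch*seq, seq]
                gold_heads.reshape(-1))

        # Test-time injection: replace the learned distribution with a
        # one-hot distribution over parents from an external parser,
        # with no re-training of the rest of the model.
        if injected_heads is not None:
            attn = F.one_hot(injected_heads, num_classes=x.size(1)).float()

        out = torch.matmul(attn, self.v(x))  # attend to (soft) parents
        return out, parse_loss
```

In such a setup, the cross-entropy term over gold_heads would be added to the SRL training objective as an auxiliary parsing loss, while passing injected_heads at inference substitutes a higher-quality external parse for the learned attention, which is what makes test-time parse injection possible without re-training.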