Multi-task learning (MTL) has become increasingly popular in natural language processing (NLP) because it improves the performance of related tasks by exploiting their commonalities and differences. Nevertheless, it is still not well understood how multi-task learning should be implemented based on the relatedness of the training tasks. In this survey, we review recent advances in multi-task learning methods in NLP, with the aim of summarizing them into two general multi-task training schemes based on task relatedness: (i) joint training and (ii) multi-step training. We present examples across various downstream NLP applications, summarize the task relationships, and discuss future directions for this promising topic.
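To make the distinction concrete, below is a minimal sketch of the joint-training scheme: a shared encoder feeds task-specific heads, and the per-task losses are summed at every optimization step so all tasks update the shared parameters together. The architecture, task names, dimensions, and synthetic data are illustrative assumptions, not a method from the survey.

```python
# Minimal sketch of joint multi-task training: a shared encoder with
# task-specific heads, losses summed per step. All names and sizes are
# illustrative assumptions.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)

    def forward(self, tokens):
        h, _ = self.encoder(self.embed(tokens))
        return h.mean(dim=1)  # mean-pool token states into a sequence vector

encoder = SharedEncoder()
heads = nn.ModuleDict({
    "sentiment": nn.Linear(64, 2),  # hypothetical binary sentiment task
    "topic": nn.Linear(64, 5),      # hypothetical 5-way topic task
})
params = list(encoder.parameters()) + list(heads.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Joint training: each step draws a batch from every task and sums the
# losses, so gradients from all tasks flow into the shared encoder at once.
for step in range(100):
    optimizer.zero_grad()
    total_loss = 0.0
    for task, head in heads.items():
        tokens = torch.randint(0, 1000, (8, 16))            # synthetic batch
        labels = torch.randint(0, head.out_features, (8,))  # synthetic labels
        total_loss = total_loss + loss_fn(head(encoder(tokens)), labels)
    total_loss.backward()
    optimizer.step()
```

Summing losses with equal weight is the simplest joint scheme; per-task loss weights or task-sampling schedules are common refinements. Multi-step training, by contrast, would train the shared encoder on one task to completion before fine-tuning on the next.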