Neural chat translation (NCT) aims to translate a cross-lingual chat between speakers of different languages. Existing context-aware NMT models cannot achieve satisfactory performance due to the following inherent problems: 1) limited resources of annotated bilingual dialogues; 2) the neglect of modelling conversational properties; 3) training discrepancy between different stages. To address these issues, in this paper, we propose a multi-task multi-stage transitional (MMT) training framework, where an NCT model is trained using the bilingual chat translation dataset and additional monolingual dialogues. We elaborately design two auxiliary tasks, namely utterance discrimination and speaker discrimination, to introduce the modelling of dialogue coherence and speaker characteristics into the NCT model. The training process consists of three stages: 1) sentence-level pre-training on a large-scale parallel corpus; 2) intermediate training with auxiliary tasks using additional monolingual dialogues; 3) context-aware fine-tuning with a gradual transition. In particular, the second stage serves as an intermediate phase that alleviates the training discrepancy between the pre-training and fine-tuning stages. Moreover, to make the stage transition smoother, we train the NCT model using a gradual transition strategy, i.e., gradually transitioning from monolingual to bilingual dialogues. Extensive experiments on two language pairs demonstrate the effectiveness and superiority of our proposed training framework.
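The abstract does not specify how the gradual transition is scheduled. A minimal sketch of one plausible realization, assuming a linear schedule over the transition phase in which the probability of drawing a bilingual dialogue batch (rather than a monolingual one) grows from 0 to 1; the function and parameter names are illustrative, not from the paper:

```python
import random


def bilingual_prob(step: int, transition_steps: int) -> float:
    """Probability of sampling a bilingual batch at a given step.

    Assumed linear schedule: 0 at the start of the transition phase,
    1 once `transition_steps` have elapsed (clamped thereafter).
    """
    return min(1.0, step / transition_steps)


def sample_batch(step, transition_steps, mono_batches, bi_batches, rng=random):
    """Draw a training batch, gradually shifting from monolingual
    dialogues to bilingual chat-translation data."""
    if rng.random() < bilingual_prob(step, transition_steps):
        return rng.choice(bi_batches)
    return rng.choice(mono_batches)
```

Under this schedule, early fine-tuning steps mostly reuse the monolingual dialogue data from the intermediate stage, so the model's input distribution changes smoothly rather than abruptly at the stage boundary.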