Pre-training (PT) and back-translation (BT) are two simple and powerful methods for exploiting monolingual data to improve the performance of neural machine translation (NMT) models. This paper takes the first step toward investigating the complementarity between PT and BT. We introduce two probing tasks for PT and BT, respectively, and find that PT mainly contributes to the encoder module while BT benefits the decoder more. Experimental results show that PT and BT are nicely complementary to each other, establishing state-of-the-art performance on the WMT16 English-Romanian and English-Russian benchmarks. Through extensive analyses of sentence originality and word frequency, we also demonstrate that combining Tagged BT with PT further strengthens their complementarity, leading to better translation quality. Source code is freely available at https://github.com/SunbowLiu/PTvsBT.
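To make the Tagged BT setup concrete, the sketch below illustrates one common way of constructing tagged back-translation data: a reserved tag token is prepended to each synthetic (back-translated) source sentence so the model can distinguish synthetic from genuine bitext. This is a minimal illustration, not code from the paper's repository; the tag string `<BT>`, the function name, and the toy sentences are assumptions for the example.

```python
# Minimal sketch of Tagged BT data construction (illustrative; not the paper's code).
# Assumption: the tag token "<BT>" is added to the source-side vocabulary.

BT_TAG = "<BT>"

def tag_back_translations(synthetic_sources, targets):
    """Pair tagged back-translated sources with their original monolingual targets."""
    assert len(synthetic_sources) == len(targets)
    return [(f"{BT_TAG} {src}", tgt) for src, tgt in zip(synthetic_sources, targets)]

# Toy usage: the tagged synthetic pairs would be mixed with genuine parallel data
# before fine-tuning a pre-trained (PT) encoder-decoder model on the combined corpus.
synthetic_ro = ["el merge la școală .", "cartea este pe masă ."]      # model-generated Romanian
monolingual_en = ["he goes to school .", "the book is on the table ."]  # original English targets
tagged_pairs = tag_back_translations(synthetic_ro, monolingual_en)
print(tagged_pairs[0])  # ('<BT> el merge la școală .', 'he goes to school .')
```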