In this paper, we present Neural Phrase-based Machine Translation (NPMT). Our method explicitly models the phrase structures in output sequences using Sleep-WAke Networks (SWAN), a recently proposed segmentation-based sequence modeling method. To mitigate the monotonic alignment requirement of SWAN, we introduce a new layer to perform (soft) local reordering of input sequences. Unlike existing neural machine translation (NMT) approaches, NPMT does not use attention-based decoding mechanisms. Instead, it directly outputs phrases in a sequential order and can decode in linear time. Our experiments show that NPMT achieves superior performance on the IWSLT 2014 German-English/English-German and IWSLT 2015 English-Vietnamese machine translation tasks compared with strong NMT baselines. We also observe that our method produces meaningful phrases in the output languages.