In recent years, Neural Machine Translation (NMT) has achieved notable results on various translation tasks. However, the word-by-word generation imposed by the autoregressive mechanism leads to high translation latency and restricts the use of NMT in low-latency applications. Non-Autoregressive Neural Machine Translation (NAT) removes the autoregressive mechanism and achieves significant decoding speedup by generating target words independently and simultaneously. Nevertheless, NAT is still trained with the word-level cross-entropy loss, which is suboptimal because, due to the multimodality problem, the output of NAT cannot be properly evaluated word by word. In this paper, we propose using sequence-level training objectives to train NAT models, which evaluate the NAT output as a whole and correlate well with real translation quality. First, we propose training NAT models to optimize sequence-level evaluation metrics (e.g., BLEU) with several novel reinforcement algorithms customized for NAT, which outperform the conventional method by reducing the variance of gradient estimation. Second, we introduce a novel training objective for NAT models that minimizes the Bag-of-Ngrams (BoN) difference between the model output and the reference sentence. The BoN training objective is differentiable and can be computed efficiently without any approximation. Finally, we apply a three-stage training strategy to combine these two methods. We validate our approach on four translation tasks (WMT14 En$\leftrightarrow$De, WMT16 En$\leftrightarrow$Ro), showing that it largely outperforms NAT baselines and achieves remarkable performance on all translation tasks.
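The two sequence-level objectives described above can be illustrated with a small self-contained sketch. The snippet below is not the paper's implementation; it is a minimal illustration that assumes a NAT model whose output is an independent categorical distribution over the vocabulary at each target position. `reinforce_grad` estimates a sequence-level policy gradient with the mean sampled reward as a baseline (one simple way to reduce gradient variance; `ngram_overlap_reward` is a toy stand-in for BLEU). `expected_ngram_count` computes the expected count of an n-gram under the independent per-position distributions, which is the differentiable quantity that a BoN-style objective compares against the reference bag of n-grams.

```python
import math
import random

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)
    return [e / z for e in exps]

def ngram_overlap_reward(hyp, ref, n=2):
    """Toy sequence-level reward standing in for BLEU: the fraction of
    reference n-grams that also appear in the hypothesis."""
    ref_ngrams = {tuple(ref[i:i + n]) for i in range(len(ref) - n + 1)}
    hyp_ngrams = {tuple(hyp[i:i + n]) for i in range(len(hyp) - n + 1)}
    if not ref_ngrams:
        return 0.0
    return len(ref_ngrams & hyp_ngrams) / len(ref_ngrams)

def reinforce_grad(logits, ref, num_samples=100, seed=0):
    """REINFORCE gradient estimate for a NAT-style model with an
    independent categorical distribution per target position.  The mean
    sampled reward serves as a baseline to reduce gradient variance."""
    rng = random.Random(seed)
    T, V = len(logits), len(logits[0])
    probs = [softmax(row) for row in logits]
    samples, rewards = [], []
    for _ in range(num_samples):
        s = [rng.choices(range(V), weights=probs[t])[0] for t in range(T)]
        samples.append(s)
        rewards.append(ngram_overlap_reward(s, ref))
    baseline = sum(rewards) / num_samples
    grad = [[0.0] * V for _ in range(T)]
    for s, r in zip(samples, rewards):
        adv = r - baseline
        for t in range(T):
            # d/d(logits[t]) of log p(s_t) is one_hot(s_t) - probs[t]
            for v in range(V):
                grad[t][v] += adv * ((1.0 if v == s[t] else 0.0) - probs[t][v])
    return [[g / num_samples for g in row] for row in grad], baseline

def expected_ngram_count(probs, ngram):
    """Expected number of occurrences of `ngram` under independent
    per-position distributions.  Differentiable in the probabilities,
    so a BoN-style loss can match it against the reference counts."""
    n = len(ngram)
    total = 0.0
    for t in range(len(probs) - n + 1):
        p = 1.0
        for i, tok in enumerate(ngram):
            p *= probs[t + i][tok]
        total += p
    return total
```

Note that because each position's distribution is independent, the expected n-gram count factorizes into a product of per-position probabilities, which is what makes the BoN objective tractable without enumerating output sentences.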


