Neural Machine Translation (NMT) has recently drawn much attention due to its promising translation performance. However, several studies indicate that NMT often generates fluent but unfaithful translations. In this paper, we propose a method to alleviate this problem by using a phrase table as recommendation memory. The main idea is to add a bonus to words worthy of recommendation, so that NMT can make correct predictions. Specifically, we first derive a prefix tree to accommodate all the candidate target phrases by searching the phrase translation table according to the source sentence. Then, we construct a recommendation word set by matching the candidate target phrases against the target words previously translated by NMT. After that, we determine the specific bonus value for each recommendable word by using the attention vector and the phrase translation probability. Finally, we integrate this bonus value into NMT to improve the translation results. Extensive experiments demonstrate that the proposed method obtains remarkable improvements over a strong attention-based NMT baseline.
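The first two steps of the pipeline — building a prefix tree over the candidate target phrases retrieved from the phrase table, then matching the partially generated translation against it to obtain the recommendation word set with phrase translation probabilities — can be illustrated with a minimal sketch. All names and the simplified suffix-matching here are illustrative assumptions, not the paper's implementation:

```python
class TrieNode:
    """A node in the prefix tree over candidate target phrases."""
    def __init__(self):
        self.children = {}   # next word -> TrieNode
        self.prob = None     # phrase translation probability, set at phrase end

def build_prefix_tree(candidate_phrases):
    """Accommodate candidate target phrases.

    candidate_phrases: {tuple_of_target_words: phrase_translation_prob},
    e.g. retrieved from the phrase table for one source sentence.
    """
    root = TrieNode()
    for phrase, prob in candidate_phrases.items():
        node = root
        for word in phrase:
            node = node.children.setdefault(word, TrieNode())
        node.prob = prob
    return root

def recommend(root, translated_suffix):
    """Match previously translated words against the tree and return the
    recommendation word set {next_word: phrase_prob} for the current step.
    A probability of 0.0 marks a mid-phrase continuation whose full-phrase
    probability is attached deeper in the tree (simplifying assumption)."""
    node = root
    for word in translated_suffix:
        if word not in node.children:
            return {}            # no candidate phrase continues this suffix
        node = node.children[word]
    return {w: (c.prob if c.prob is not None else 0.0)
            for w, c in node.children.items()}

# Example: two candidate phrases sharing the prefix "new".
root = build_prefix_tree({("new", "york"): 0.8, ("new", "year"): 0.6})
print(recommend(root, ("new",)))   # both continuations are recommendable
```

In the full method, each recommended word's probability would further be scaled by the attention vector to produce the final bonus added to the NMT prediction scores; that weighting step is omitted here.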