Improving machine translation (MT) systems with translation memories (TMs) is of great interest to practitioners in the MT community. However, previous approaches require a significant update of the model architecture and/or additional training effort to make the models well-behaved when TMs are taken as additional input. In this paper, we present a simple but effective method to introduce TMs into neural machine translation (NMT) systems. Specifically, we treat TMs as prompts to the NMT model at test time, but leave the training process unchanged. The result is a slight update of an existing NMT system, which can be implemented in a few hours by anyone who is familiar with NMT. Experimental results on several datasets demonstrate that our system significantly outperforms strong baselines.
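As a rough illustration only (not the paper's actual implementation), the sketch below shows one way such test-time TM prompting could look with an off-the-shelf seq2seq NMT checkpoint: a retrieved TM target is fed to the decoder as a prefix, and the unchanged model continues generating from it. The model name, the toy fuzzy matcher, and the decoder-prefix prompting scheme are all assumptions made for the example.

```python
# Illustrative sketch (assumptions throughout): prompt a pretrained NMT model
# at test time with a retrieved TM target used as a decoder-side prefix.
# The checkpoint, retrieval heuristic, and prompting scheme are placeholders,
# not the method described in the paper.
import torch
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"  # any seq2seq NMT checkpoint would do
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name).eval()

def retrieve_tm(source: str, memory: list[tuple[str, str]]) -> str:
    """Toy fuzzy match: return the target side of the TM entry whose source
    shares the most words with the input sentence."""
    src_words = set(source.lower().split())
    best = max(memory, key=lambda pair: len(src_words & set(pair[0].lower().split())))
    return best[1]

def translate_with_tm(source: str, tm_target: str) -> str:
    enc = tokenizer(source, return_tensors="pt")
    # Build a decoder prefix: decoder start token followed by the TM target
    # (EOS dropped), so generation continues "after" the prompt.
    prefix = tokenizer(text_target=tm_target, return_tensors="pt").input_ids[:, :-1]
    start = torch.full((1, 1), model.config.decoder_start_token_id, dtype=torch.long)
    decoder_input_ids = torch.cat([start, prefix], dim=-1)
    out = model.generate(**enc, decoder_input_ids=decoder_input_ids, max_new_tokens=64)
    # Note: the decoded string still contains the TM prompt at its beginning.
    return tokenizer.decode(out[0], skip_special_tokens=True)

memory = [("I like green tea.", "Ich mag grünen Tee.")]
src = "I like black tea."
print(translate_with_tm(src, retrieve_tm(src, memory)))
```

Because the prompting happens entirely at decoding time, nothing about the checkpoint or its training is modified; swapping the retrieval heuristic or the prefix construction would not require touching the model either.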