In this work, we present a novel method for neural machine translation (NMT) based on a denoising diffusion probabilistic model (DDPM) adapted for textual data, following recent advances in the field. We show that it is possible to translate sentences non-autoregressively using a diffusion model conditioned on the source sentence. We also show that our model can translate between pairs of languages unseen during training (zero-shot learning).