It has been shown that the performance of neural machine translation (NMT) drops starkly in low-resource conditions, often requiring large amounts of auxiliary data to achieve competitive results. An effective method of generating auxiliary data is back-translation of target-language sentences. In this work, we present a case study of Tigrinya in which we investigate several back-translation methods to generate synthetic source sentences. We find that in low-resource conditions, back-translation by pivoting through a higher-resource language related to the target language proves most effective, resulting in substantial improvements over baselines.