The success of pretrained cross-lingual language models relies on two essential abilities: generalization, for learning downstream tasks in a source language, and cross-lingual transferability, for transferring that task knowledge to other languages. However, current methods jointly learn the two abilities in a single-phase cross-lingual pretraining process, resulting in a trade-off between generalization and cross-lingual transfer. In this paper, we propose cross-lingual language model meta-pretraining, which learns the two abilities in different training phases. Our method introduces an additional meta-pretraining phase before cross-lingual pretraining, in which the model learns generalization ability on a large-scale monolingual corpus. The model then focuses on learning cross-lingual transfer on a multilingual corpus. Experimental results show that our method improves both generalization and cross-lingual transfer, and produces better-aligned representations across different languages.
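The two-phase schedule can be summarized as: continue training the same parameters, but switch the data source between phases. Below is a minimal sketch of that schedule using plain PyTorch with a toy masked-language-model objective; all names (ToyMaskedLM, random_batches, the hyperparameters) are illustrative assumptions for exposition and are not the authors' implementation.

```python
# Minimal two-phase pretraining sketch (assumed setup, not the paper's code).
import torch
import torch.nn as nn

VOCAB, HIDDEN, MASK_ID = 1000, 64, 0

class ToyMaskedLM(nn.Module):
    """A tiny Transformer encoder with a masked-LM head."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(HIDDEN, nhead=4, batch_first=True),
            num_layers=2)
        self.lm_head = nn.Linear(HIDDEN, VOCAB)

    def forward(self, tokens):
        return self.lm_head(self.encoder(self.embed(tokens)))

def mlm_step(model, optimizer, tokens):
    """One masked-LM update: mask ~15% of tokens and predict the originals."""
    mask = torch.rand(tokens.shape) < 0.15
    logits = model(tokens.masked_fill(mask, MASK_ID))
    loss = nn.functional.cross_entropy(logits[mask], tokens[mask])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def pretrain(model, batches, steps):
    """Run `steps` masked-LM updates on the given batch iterator."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    for _, tokens in zip(range(steps), batches):
        mlm_step(model, optimizer, tokens)

def random_batches():
    """Stand-in for a real corpus iterator (batch of 8 sequences, length 32)."""
    while True:
        yield torch.randint(1, VOCAB, (8, 32))

model = ToyMaskedLM()
# Phase 1 (meta-pretraining): learn generalization on a monolingual corpus.
pretrain(model, random_batches(), steps=10)
# Phase 2 (cross-lingual pretraining): continue from the same weights on a
# multilingual corpus so training focuses on cross-lingual transfer.
pretrain(model, random_batches(), steps=10)
```

The key design point the sketch illustrates is that the second phase resumes from the first phase's weights rather than training both objectives jointly from scratch; in practice each phase would consume its own corpus (monolingual vs. multilingual) in place of the random batches above.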