Pretrained language models have improved effectiveness on numerous tasks, including ad-hoc retrieval. Recent work has shown that continuing to pretrain a language model with auxiliary objectives before fine-tuning on the retrieval task can further improve retrieval effectiveness. Unlike in monolingual retrieval, however, designing an appropriate auxiliary task that captures cross-language mappings is challenging. To address this challenge, we use comparable Wikipedia articles in different languages to further pretrain off-the-shelf multilingual pretrained models before fine-tuning on the retrieval task. We show that our approach yields improvements in retrieval effectiveness.
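For intuition, below is a minimal sketch of the continued-pretraining idea: aligning representations of comparable Wikipedia article pairs across languages with an off-the-shelf multilingual encoder before fine-tuning for retrieval. The contrastive objective with in-batch negatives, the choice of xlm-roberta-base, the mean pooling, the temperature, and the toy sentence pairs are all illustrative assumptions, not the exact recipe described above.

```python
# Hedged sketch: continued pretraining of a multilingual encoder on
# comparable article pairs via a contrastive loss (an assumed auxiliary
# objective, not necessarily the paper's). Requires torch + transformers.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_name = "xlm-roberta-base"  # any off-the-shelf multilingual encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Toy comparable pairs: same topic, different languages (made-up data).
pairs = [
    ("Information retrieval is the task of finding relevant documents.",
     "信息检索是查找相关文档的任务。"),
    ("The Eiffel Tower is a landmark in Paris.",
     "埃菲尔铁塔是巴黎的地标。"),
]

def embed(texts):
    """Mean-pooled contextual embeddings for a batch of texts."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)       # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)        # (B, H)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
temperature = 0.05  # illustrative hyperparameter

model.train()
src = embed([p[0] for p in pairs])
tgt = embed([p[1] for p in pairs])
# In-batch contrastive loss: each article should score highest against
# its comparable counterpart; other batch items act as negatives.
sims = F.normalize(src, dim=-1) @ F.normalize(tgt, dim=-1).T / temperature
labels = torch.arange(len(pairs))
loss = F.cross_entropy(sims, labels)
loss.backward()
optimizer.step()
```

After this continued-pretraining stage, the encoder would be fine-tuned on the downstream retrieval task as usual; only the auxiliary stage differs from standard pipelines.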