The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query. Although the most common formulation of text ranking is search, instances of the task can also be found in many natural language processing applications. This survey provides an overview of text ranking with neural network architectures known as transformers, of which BERT is the best-known example. The combination of transformers and self-supervised pretraining has, without exaggeration, revolutionized the fields of natural language processing (NLP), information retrieval (IR), and beyond. In this survey, we provide a synthesis of existing work as a single point of entry for practitioners who wish to gain a better understanding of how to apply transformers to text ranking problems and researchers who wish to pursue work in this area. We cover a wide range of modern techniques, grouped into two high-level categories: transformer models that perform reranking in multi-stage ranking architectures and learned dense representations that attempt to perform ranking directly. There are two themes that pervade our survey: techniques for handling long documents, beyond the typical sentence-by-sentence processing approaches used in NLP, and techniques for addressing the tradeoff between effectiveness (result quality) and efficiency (query latency). Although transformer architectures and pretraining techniques are recent innovations, many aspects of how they are applied to text ranking are relatively well understood and represent mature techniques. However, there remain many open research questions, and thus in addition to laying out the foundations of pretrained transformers for text ranking, this survey also attempts to prognosticate where the field is heading.