Modeling long texts has been an essential technique in the field of natural language processing (NLP). With the ever-growing number of long documents, it is important to develop effective modeling methods that can process and analyze such texts. However, long texts pose significant research challenges for existing text models, owing to their more complex semantics and special characteristics. In this paper, we provide an overview of recent advances in long text modeling based on Transformer models. First, we introduce the formal definition of long text modeling. Then, as the core content, we discuss how to process long input to satisfy the length limitation and how to design improved Transformer architectures that effectively extend the maximum context length. Following this, we discuss how to adapt Transformer models to capture the special characteristics of long texts. Finally, we describe four typical applications involving long text modeling and conclude the paper with a discussion of future directions. Our survey intends to provide researchers with a synthesis of, and pointer to, related work on long text modeling.