Transformers have achieved success in both the language and vision domains. However, scaling them to long sequences such as long documents or high-resolution images is prohibitively expensive, because the self-attention mechanism has quadratic time and memory complexity with respect to the input sequence length. In this paper, we propose Long-Short Transformer (Transformer-LS), an efficient self-attention mechanism for modeling long sequences with linear complexity for both language and vision tasks. It aggregates a novel long-range attention with dynamic projection to model distant correlations and a short-term attention to capture fine-grained local correlations. We propose a dual normalization strategy to account for the scale mismatch between the two attention mechanisms. Transformer-LS can be applied to both autoregressive and bidirectional models without additional complexity. Our method outperforms state-of-the-art models on multiple tasks in the language and vision domains, including the Long Range Arena benchmark, autoregressive language modeling, and ImageNet classification. For instance, Transformer-LS achieves 0.97 test BPC on enwik8 with half as many parameters as the previous method, while being faster and able to handle 3x longer sequences than its full-attention counterpart on the same hardware. On ImageNet, it obtains state-of-the-art results (e.g., a moderately sized 55.8M-parameter model trained solely on 224x224 ImageNet-1K reaches 84.1% Top-1 accuracy), while being more scalable on high-resolution images. The source code and models are released at https://github.com/NVIDIA/transformer-ls .
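As a rough illustration of how the two attention branches fit together, below is a minimal single-head PyTorch sketch, not the released implementation: the names (`LongShortAttentionSketch`, `proj_len`, `window`) and the exact placement of the dual LayerNorms are assumptions, and the short-term branch is written with a full banded score matrix for brevity, whereas the actual method keeps both branches linear in sequence length.

```python
# Minimal single-head sketch of long-short attention (illustrative only).
import math
import torch
import torch.nn as nn


class LongShortAttentionSketch(nn.Module):
    def __init__(self, dim, window=4, proj_len=8):
        super().__init__()
        self.dim = dim
        self.window = window      # radius of the short-term (local) attention
        self.proj_len = proj_len  # r: length of the dynamically projected sequence
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Dynamic projection: a learned map from keys to an n -> r mixing matrix.
        self.to_proj = nn.Linear(dim, proj_len)
        # Dual normalization (assumed placement): separate LayerNorms for the
        # long-range and short-term branches to align their scales.
        self.ln_long = nn.LayerNorm(dim)
        self.ln_short = nn.LayerNorm(dim)

    def forward(self, x):
        # x: (batch, n, dim)
        b, n, d = x.shape
        q, k, v = self.q(x), self.k(x), self.v(x)

        # Long-range branch: compress keys/values to length r with a
        # data-dependent projection computed from the keys.
        p = torch.softmax(self.to_proj(k), dim=1)                   # (b, n, r)
        k_long = self.ln_long(torch.einsum('bnr,bnd->brd', p, k))   # (b, r, d)
        v_long = self.ln_long(torch.einsum('bnr,bnd->brd', p, v))   # (b, r, d)

        # Short-term branch: keys/values restricted to a local window.
        # Note: the banded mask below materializes an n x n matrix for clarity;
        # the paper's implementation uses segment-wise windows to stay linear.
        k_short = self.ln_short(k)
        v_short = self.ln_short(v)
        scores_long = torch.einsum('bnd,brd->bnr', q, k_long)       # (b, n, r)
        scores_full = torch.einsum('bnd,bmd->bnm', q, k_short)      # (b, n, n)
        idx = torch.arange(n, device=x.device)
        band = (idx[None, :] - idx[:, None]).abs() <= self.window
        scores_short = scores_full.masked_fill(~band, float('-inf'))

        # Aggregate: one softmax over the concatenated long + short scores.
        scores = torch.cat([scores_long, scores_short], dim=-1) / math.sqrt(d)
        attn = torch.softmax(scores, dim=-1)
        out = torch.einsum('bnr,brd->bnd', attn[..., :self.proj_len], v_long) \
            + torch.einsum('bnm,bmd->bnd', attn[..., self.proj_len:], v_short)
        return out


# Usage example (shapes only):
# layer = LongShortAttentionSketch(dim=64, window=4, proj_len=8)
# y = layer(torch.randn(2, 128, 64))   # -> (2, 128, 64)
```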