Transformers are quickly becoming one of the most heavily applied deep learning architectures across modalities, domains, and tasks. In vision, on top of ongoing efforts into plain transformers, hierarchical transformers have also gained significant attention, thanks to their performance and easy integration into existing frameworks. These models typically employ localized attention mechanisms, such as the sliding-window Neighborhood Attention (NA) or Swin Transformer's Shifted Window Self Attention. While effective at reducing self attention's quadratic complexity, local attention weakens two of the most desirable properties of self attention: long-range inter-dependency modeling and the global receptive field. In this paper, we introduce Dilated Neighborhood Attention (DiNA), a natural, flexible, and efficient extension to NA that can capture more global context and expand receptive fields exponentially at no additional cost. NA's local attention and DiNA's sparse global attention complement each other, and we therefore introduce the Dilated Neighborhood Attention Transformer (DiNAT), a new hierarchical vision transformer built upon both. DiNAT variants enjoy significant improvements over strong baselines such as NAT, Swin, and ConvNeXt. Our large model is faster than its Swin counterpart and ahead of it by 1.6% box AP in COCO object detection, 1.4% mask AP in COCO instance segmentation, and 1.4% mIoU in ADE20K semantic segmentation. Paired with new frameworks, our large variant is the new state-of-the-art panoptic segmentation model on COCO (58.5 PQ) and ADE20K (49.4 PQ), and instance segmentation model on Cityscapes (45.1 AP) and ADE20K (35.4 AP) (no extra data). It also matches the state-of-the-art specialized semantic segmentation models on ADE20K (58.1 mIoU), and ranks second on Cityscapes (84.5 mIoU) (no extra data).
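To make the mechanism concrete, the sketch below illustrates the core idea of DiNA in a simplified 1-D setting: each query attends to a fixed-size neighborhood of keys sampled at a stride equal to the dilation factor, so a dilation of 1 recovers NA's dense local window while larger dilations cover the sequence sparsely and grow the receptive field without extra cost. This is a minimal illustrative sketch, not the optimized implementation; the function name, the clamped border handling, and the single-head 1-D formulation are simplifications introduced here for exposition.

```python
import torch
import torch.nn.functional as F

def dilated_neighborhood_attention_1d(q, k, v, kernel_size=3, dilation=2):
    """Minimal 1-D sketch of dilated neighborhood attention (illustrative only).

    q, k, v: tensors of shape (seq_len, dim). Each query attends to
    `kernel_size` keys sampled around it at stride `dilation`;
    dilation=1 corresponds to plain Neighborhood Attention (NA).
    """
    seq_len, dim = q.shape
    half = kernel_size // 2
    out = torch.empty_like(q)
    for i in range(seq_len):
        # Neighbor indices spaced by `dilation`; borders are simply clamped
        # here, a simplification of the exact neighborhood definition.
        idx = (torch.arange(-half, half + 1) * dilation + i).clamp(0, seq_len - 1)
        attn = (q[i] @ k[idx].T) / dim ** 0.5      # (kernel_size,) scaled dot products
        out[i] = F.softmax(attn, dim=-1) @ v[idx]  # weighted sum of neighbor values
    return out

# Toy usage: 16 tokens, 8-dim embeddings, sparse global attention via dilation=4.
x = torch.randn(16, 8)
y = dilated_neighborhood_attention_1d(x, x, x, kernel_size=3, dilation=4)
print(y.shape)  # torch.Size([16, 8])
```

Alternating small and large dilation values across layers, as the hierarchical design suggests, lets local NA windows and sparse global DiNA windows complement each other.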