We propose an effective lightweight dynamic local and global self-attention network (DLGSANet) for image super-resolution. Our method exploits the properties of Transformers while keeping computational costs low. Motivated by the network designs of Transformers, we develop a simple yet effective multi-head dynamic local self-attention (MHDLSA) module to extract local features efficiently. In addition, we note that existing Transformers usually explore all the similarities between the query and key tokens for feature aggregation. However, not all the query tokens are relevant to the key tokens, so using all the similarities does not effectively facilitate high-resolution image reconstruction. To overcome this problem, we develop a sparse global self-attention (SparseGSA) module that selects the most useful similarity values so that the most informative global features are better utilized for high-resolution image reconstruction. We develop a hybrid dynamic-Transformer block (HDTB) that integrates MHDLSA and SparseGSA for both local and global feature exploration. To ease network training, we formulate the HDTBs into a residual hybrid dynamic-Transformer group (RHDTG). By embedding the RHDTGs into an end-to-end trainable network, we show that the proposed method has fewer network parameters and lower computational costs while achieving accuracy competitive with state-of-the-art methods. More information is available at https://neonleexiang.github.io/DLGSANet/
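The abstract does not spell out how SparseGSA "selects the most useful similarity values", so the sketch below assumes one plausible instantiation: a top-k selection over the query-key scores before the softmax, masking out the remaining tokens. The class name `SparseGlobalAttention`, the head layout, and the top-k rule are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class SparseGlobalAttention(nn.Module):
    """Global self-attention keeping only the top-k similarities per query."""
    def __init__(self, dim: int, num_heads: int = 4, topk: int = 16):
        super().__init__()
        self.num_heads = num_heads
        self.topk = topk
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), tokens flattened from the feature map
        b, n, c = x.shape
        h = self.num_heads
        qkv = self.qkv(x).reshape(b, n, 3, h, c // h).permute(2, 0, 3, 1, 4)
        q, k, v = qkv[0], qkv[1], qkv[2]                      # (b, h, n, c//h)
        attn = (q @ k.transpose(-2, -1)) * (c // h) ** -0.5   # (b, h, n, n)
        # Keep only the k largest similarity values per query; the other
        # (less relevant) tokens are masked out so they do not dilute the
        # feature aggregation.
        k_eff = min(self.topk, n)
        thresh = attn.topk(k_eff, dim=-1).values[..., -1:]    # k-th largest
        attn = attn.masked_fill(attn < thresh, float("-inf"))
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, c)
        return self.proj(out)
```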
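For the overall structure, here is a minimal sketch of how the HDTB and RHDTG could compose, reusing the `SparseGlobalAttention` sketch above. The sequential local-then-global ordering, the pre-norm skip connections, and the depthwise-convolution stand-in for MHDLSA are assumptions for illustration; the real MHDLSA predicts its local mixing weights dynamically from the input rather than learning a fixed kernel.

```python
class HDTB(nn.Module):
    """Hybrid dynamic-Transformer block: local, then global feature mixing."""
    def __init__(self, dim: int, num_heads: int = 4, topk: int = 16):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        # Stand-in for MHDLSA: a depthwise 1-D conv mixes each token with its
        # neighbors; the actual module generates these weights dynamically.
        self.local_mix = nn.Conv1d(dim, dim, kernel_size=3,
                                   padding=1, groups=dim)
        self.norm2 = nn.LayerNorm(dim)
        self.global_attn = SparseGlobalAttention(dim, num_heads, topk)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        y = self.norm1(x).transpose(1, 2)           # (b, dim, tokens) for conv
        x = x + self.local_mix(y).transpose(1, 2)   # local branch + skip
        x = x + self.global_attn(self.norm2(x))     # global branch + skip
        return x

class RHDTG(nn.Module):
    """Residual hybrid dynamic-Transformer group: stacked HDTBs + long skip."""
    def __init__(self, dim: int, depth: int = 4):
        super().__init__()
        self.blocks = nn.Sequential(*[HDTB(dim) for _ in range(depth)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.blocks(x)  # the long residual skip eases training
```

A quick shape check of the group on a flattened feature map:

```python
x = torch.randn(1, 64 * 64, 60)  # 64x64 feature map, 60 channels, flattened
y = RHDTG(dim=60)(x)             # output keeps the shape (1, 4096, 60)
```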