Recently, transformer-based methods have made impressive progress in single-image super-resolution (SR). However, these methods are difficult to apply to lightweight SR (LSR) due to the challenge of balancing model performance and complexity. In this paper, we propose an efficient striped window transformer (ESWT). ESWT consists of efficient transformation layers (ETLs), allowing a clean structure and avoiding redundant operations. Moreover, we design a striped window mechanism that makes ESWT more efficient at modeling long-range dependencies. To further exploit the potential of the transformer, we propose a novel flexible window training strategy that improves the performance of ESWT without any additional cost. Extensive experiments show that the proposed method outperforms state-of-the-art transformer-based LSR methods with fewer parameters, faster inference, fewer FLOPs, and less memory consumption, achieving a better trade-off between model performance and complexity.
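To make the striped window idea concrete, the sketch below partitions a feature map into non-overlapping rectangular stripes (e.g. 4×16 instead of square 8×8 windows), which widens the attention span along one axis at the same per-window token count. This is a minimal illustration only; the function name, stripe sizes, and tensor layout are assumptions, not the paper's actual implementation.

```python
import numpy as np

def stripe_partition(x: np.ndarray, sh: int, sw: int) -> np.ndarray:
    """Split an (H, W, C) feature map into non-overlapping (sh, sw) stripes.

    Returns an array of shape (num_stripes, sh * sw, C): one token
    sequence per stripe, ready for windowed self-attention.
    Illustrative sketch, not ESWT's actual code.
    """
    H, W, C = x.shape
    assert H % sh == 0 and W % sw == 0, "feature map must tile evenly"
    x = x.reshape(H // sh, sh, W // sw, sw, C)
    x = x.transpose(0, 2, 1, 3, 4)  # gather each stripe's tokens together
    return x.reshape(-1, sh * sw, C)

feat = np.zeros((32, 32, 16), dtype=np.float32)
wide = stripe_partition(feat, 4, 16)   # horizontal stripes: long span along W
tall = stripe_partition(feat, 16, 4)   # vertical stripes: long span along H
print(wide.shape, tall.shape)          # (16, 64, 16) (16, 64, 16)
```

Alternating horizontal and vertical stripes across successive layers is one plausible way such a mechanism can cover both axes while keeping each attention window small.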