The channel attention mechanism is a useful technique widely employed in deep convolutional neural networks to boost performance on image processing tasks, e.g., image classification and image super-resolution. It is usually designed as a parameterized sub-network embedded into the convolutional layers of the network to learn more powerful feature representations. However, current channel attention introduces additional parameters and therefore incurs higher computational costs. To address this issue, in this work we propose a Parameter-Free Channel Attention (PFCA) module that boosts the performance of popular image classification and image super-resolution networks while completely eliminating the parameter growth incurred by channel attention. Experiments on CIFAR-100, ImageNet, and DIV2K validate that our PFCA module improves the performance of ResNet on image classification and of MSRResNet on image super-resolution, respectively, while adding negligible parameters and FLOPs.
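For context, the sketch below shows a conventional, parameterized channel attention block in the squeeze-and-excitation style that the abstract refers to; it illustrates where the extra parameters come from and is not the proposed PFCA module. The class name and reduction ratio `r` are illustrative assumptions.

```python
# A minimal sketch of a standard (parameterized) SE-style channel attention
# block, shown only to illustrate the parameter growth the abstract mentions.
# This is NOT the PFCA module proposed in this work.
import torch
import torch.nn as nn

class SEChannelAttention(nn.Module):
    """Squeeze-and-excitation: global pooling followed by a two-layer
    bottleneck that adds roughly 2 * C^2 / r learnable parameters."""
    def __init__(self, channels: int, r: int = 16):  # r is an assumed reduction ratio
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                            # squeeze: (N, C, H, W) -> (N, C, 1, 1)
            nn.Conv2d(channels, channels // r, kernel_size=1),  # learnable parameters
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // r, channels, kernel_size=1),  # learnable parameters
            nn.Sigmoid(),                                       # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(x)                                   # rescale each channel

if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    attn = SEChannelAttention(64)
    print(attn(x).shape)                                        # torch.Size([2, 64, 32, 32])
    print(sum(p.numel() for p in attn.parameters()))            # extra parameters introduced
```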