Deep convolutional neural networks (CNNs) have been widely applied to low-level vision over the past five years, and appropriate CNN architectures have been designed according to the nature of different applications. However, these customized architectures gather different features by treating all pixels as equal to improve the performance of a given application, which ignores the effects of locally salient pixels and results in low training efficiency. In this paper, we propose an asymmetric CNN (ACNet) comprising an asymmetric block (AB), a memory enhancement block (MEB) and a high-frequency feature enhancement block (HFFEB) for image super-resolution. The AB utilizes one-dimensional asymmetric convolutions to intensify the square convolution kernels in the horizontal and vertical directions, promoting the influence of local salient features for single image super-resolution (SISR). The MEB fuses all hierarchical low-frequency features from the AB via a residual learning (RL) technique to resolve the long-term dependency problem, and transforms the obtained low-frequency features into high-frequency features. The HFFEB exploits both low- and high-frequency features to obtain more robust super-resolution features and to address the excessive feature enhancement problem; it is also responsible for reconstructing the high-resolution (HR) image. Extensive experiments show that our ACNet can effectively address SISR, blind SISR, and blind SISR with unknown noise. The code of ACNet is available at https://github.com/hellloxiaotian/ACNet.
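For intuition, the following is a minimal PyTorch sketch of how an asymmetric block of this kind can be composed: a square 3x3 convolution is run in parallel with 1x3 and 3x1 asymmetric convolutions, and the branch outputs are summed. The channel count, activation, and summation-based fusion are illustrative assumptions, not the exact layer configuration of the released ACNet.

```python
import torch
import torch.nn as nn

class AsymmetricBlock(nn.Module):
    """Illustrative asymmetric block (AB): one-dimensional 1x3 and 3x1
    convolutions intensify a square 3x3 kernel along the horizontal and
    vertical directions. Details are assumptions for illustration."""

    def __init__(self, channels: int = 64):
        super().__init__()
        # Square branch plus two directional (asymmetric) branches.
        self.conv_sq = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv_h = nn.Conv2d(channels, channels, kernel_size=(1, 3), padding=(0, 1))
        self.conv_v = nn.Conv2d(channels, channels, kernel_size=(3, 1), padding=(1, 0))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Summing the branches lets the directional responses reinforce
        # the square kernel's response at locally salient pixels.
        return self.act(self.conv_sq(x) + self.conv_h(x) + self.conv_v(x))

# Usage sketch: a batch of 64-channel feature maps.
if __name__ == "__main__":
    block = AsymmetricBlock(channels=64)
    features = torch.randn(1, 64, 32, 32)
    print(block(features).shape)  # torch.Size([1, 64, 32, 32])
```

Because all three branches preserve spatial resolution, their hierarchical low-frequency outputs can later be fused by residual connections, as the MEB does before the high-frequency transformation.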