Recent advances in the design of convolutional neural networks (CNNs) have yielded significant improvements in the performance of image super-resolution (SR). The boost in performance can be attributed to the presence of residual or dense connections within the intermediate layers of these networks. The efficient combination of such connections can reduce the number of parameters drastically while maintaining the restoration quality. In this paper, we propose a scale recurrent SR architecture built upon units containing a series of dense connections within a residual block (Residual Dense Blocks (RDBs)), which allow extraction of abundant local features from the image. Our scale recurrent design delivers competitive performance for higher scale factors while being more parameter-efficient than current state-of-the-art approaches. To further improve the performance of our network, we employ multiple residual connections in intermediate layers (referred to as Multi-Residual Dense Blocks), which improve gradient propagation across layers. Recent works have shown that conventional loss functions can guide a network to produce results that have high PSNR but are perceptually inferior. We mitigate this issue by training our network with a Generative Adversarial Network (GAN) based framework and deep feature (VGG) losses. We experimentally demonstrate that different weighted combinations of the VGG loss and the adversarial loss enable our network outputs to traverse the perception-distortion curve. The proposed networks perform favorably against existing methods, both perceptually and objectively (in terms of PSNR), while using fewer parameters.
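To make the core building block concrete, the following is a minimal PyTorch sketch of a Residual Dense Block as described above: a series of densely connected convolutional layers, a 1x1 local feature fusion, and a residual connection back to the block input. The layer count, channel width, and growth rate are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class ResidualDenseBlock(nn.Module):
    """Sketch of a Residual Dense Block (RDB): dense connections inside a
    residual block. Hyperparameters here are assumptions for illustration."""

    def __init__(self, channels=64, growth=32, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        in_ch = channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
            ))
            # Dense connectivity: each layer sees all preceding feature maps.
            in_ch += growth
        # 1x1 convolution fuses the concatenated local features back to `channels`.
        self.fusion = nn.Conv2d(in_ch, channels, kernel_size=1)

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        fused = self.fusion(torch.cat(features, dim=1))
        # Residual connection around the densely connected layers.
        return x + fused
```

Stacking several such blocks (and, in the multi-residual variant, adding further skip connections between them) is what the abstract refers to as improving gradient propagation; a scale recurrent design would then reuse the same blocks across scale factors.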
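The weighted combination of losses mentioned above can be sketched as follows; this is a hedged illustration assuming a standard VGG-19 feature (perceptual) loss and a non-saturating adversarial loss, with the layer index and weights lambda_vgg / lambda_adv chosen arbitrarily. Varying these weights is what moves the outputs along the perception-distortion trade-off.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg19

class PerceptualAdversarialLoss(nn.Module):
    """Generator-side objective: lambda_vgg * VGG feature loss
    + lambda_adv * adversarial loss. Weights and VGG layer are assumptions."""

    def __init__(self, lambda_vgg=1.0, lambda_adv=1e-3, vgg_layer=35):
        super().__init__()
        # Frozen VGG-19 feature extractor, truncated at an assumed layer index.
        self.vgg = vgg19(pretrained=True).features[:vgg_layer].eval()
        for p in self.vgg.parameters():
            p.requires_grad = False
        self.lambda_vgg = lambda_vgg
        self.lambda_adv = lambda_adv

    def forward(self, sr, hr, disc_logits_fake):
        # Deep feature (VGG) loss between super-resolved and ground-truth images.
        loss_vgg = F.mse_loss(self.vgg(sr), self.vgg(hr))
        # Adversarial term: the generator tries to make the discriminator
        # classify its outputs as real.
        loss_adv = F.binary_cross_entropy_with_logits(
            disc_logits_fake, torch.ones_like(disc_logits_fake))
        return self.lambda_vgg * loss_vgg + self.lambda_adv * loss_adv
```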