Although single-image super-resolution (SISR) methods have achieved great success on single degradations, they still suffer performance drops under the multiple degradation effects found in real scenarios. Recently, some blind and non-blind models for multiple degradations have been explored. However, those methods usually degrade significantly under distribution shifts between the training and test data. To this end, we propose, for the first time, a conditional meta-network framework (named CMDSR), which helps the SR framework learn how to adapt to changes in the input distribution. We extract the degradation prior at the task level with the proposed ConditionNet, which is then used to adapt the parameters of the basic SR network (BaseNet). Specifically, the ConditionNet of our framework first learns the degradation prior from a support set composed of a series of degraded image patches from the same task. Then the adaptive BaseNet rapidly shifts its parameters according to the conditional features. Moreover, to better extract the degradation prior, we propose a task contrastive loss that decreases the inner-task distance and increases the cross-task distance between task-level features. Without predefined degradation maps, our blind framework can conduct a single parameter update to yield considerable SR results. Extensive experiments demonstrate the effectiveness of CMDSR over various blind and even non-blind methods. The flexible BaseNet structure also reveals that CMDSR can serve as a general framework for a large series of SISR models. Our code is available at \url{https://github.com/guanghaoyin/CMDSR}.
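The task contrastive loss can be illustrated with a minimal numpy sketch. This is a hypothetical simplification, not the paper's exact formulation: it scores a batch of task-level features by the ratio of inner-task spread (distance of each patch feature to its task centroid) to cross-task separation (mean pairwise distance between task centroids), so minimizing it pulls features of the same task together and pushes different tasks apart.

```python
import numpy as np

def task_contrastive_loss(task_feats, eps=1e-6):
    """Hypothetical sketch of a task contrastive objective.

    task_feats: array of shape (num_tasks, num_patches, dim), where each
    row of patches comes from the same degradation task.
    Returns inner-task distance / (cross-task distance + eps), so lower
    values mean tighter tasks that are better separated from each other.
    """
    # Task-level representation: centroid of each task's patch features.
    centroids = task_feats.mean(axis=1)                      # (T, dim)

    # Inner-task distance: how far patch features stray from their centroid.
    inner = np.linalg.norm(
        task_feats - centroids[:, None, :], axis=-1).mean()

    # Cross-task distance: mean pairwise distance between task centroids
    # (diagonal terms are zero, so divide by T*(T-1) off-diagonal pairs).
    t = centroids.shape[0]
    pair = np.linalg.norm(
        centroids[:, None, :] - centroids[None, :, :], axis=-1)
    cross = pair.sum() / (t * (t - 1))

    return inner / (cross + eps)
```

For example, two tight, well-separated tasks yield a much smaller loss than two overlapping, spread-out ones, which is the behavior the abstract's contrastive objective aims for.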