Photorealistic style transfer aims to transfer the artistic style of a reference image onto an input image or video while preserving photorealism. In this paper, we argue that the summary-statistics matching scheme used in existing algorithms is what leads to unrealistic stylization. To avoid the popular Gram loss, we propose a self-supervised style transfer framework, ColoristaNet, which consists of a style removal part and a style restoration part: the style removal network strips the original image style, and the style restoration network recovers it in a supervised manner. Meanwhile, to address the problems of current feature transformation methods, we propose decoupled instance normalization, which decomposes feature transformation into style whitening and restylization. It works well within ColoristaNet and transfers image styles efficiently while keeping photorealism. To ensure temporal coherence, we also incorporate optical flow methods and ConvLSTM to embed contextual information. Experiments demonstrate that ColoristaNet achieves better stylization effects than state-of-the-art algorithms.
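To make the decomposition concrete, below is a minimal sketch of what "style whitening followed by restylization" could look like on encoder features. It assumes the whitening step is channel-wise instance normalization of the content features and the restylization step re-applies the style features' per-channel statistics (an AdaIN-like stand-in); the function name, shapes, and the use of simple moment matching are illustrative assumptions, not the paper's exact formulation, which relies on learned components rather than fixed statistics.

```python
import torch

def decoupled_instance_norm(content_feat, style_feat, eps=1e-5):
    """Hypothetical sketch of decoupled instance normalization.

    Step 1 (style whitening): remove the content image's own style by
    instance-normalizing each channel of the content features.
    Step 2 (restylization): re-apply style using the per-channel mean/std
    of the style features (a simple stand-in for the learned
    restylization described in the paper).
    """
    # Per-channel statistics over spatial dims: (N, C, H, W) -> (N, C, 1, 1)
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps

    whitened = (content_feat - c_mean) / c_std   # style whitening
    restylized = whitened * s_std + s_mean       # restylization
    return restylized

# Usage with illustrative encoder-feature shapes
content = torch.randn(1, 256, 64, 64)
style = torch.randn(1, 256, 64, 64)
out = decoupled_instance_norm(content, style)
```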