The term attribute transfer refers to the task of altering an image such that its semantic interpretation is shifted in an intended direction, which is quantified by semantic attributes. Prominent example applications are photorealistic changes of facial features and expressions, such as changing the hair color, adding a smile, or enlarging the nose, as well as alterations of the entire context of a scene, such as transforming a summer landscape into a winter panorama. Recent advances in attribute transfer are mostly based on generative deep neural networks that use various techniques to manipulate images in the latent space of the generator. In this paper, we present a novel method for the common sub-task of local attribute transfer, where only parts of a face have to be altered in order to achieve a semantic change (e.g. removing a mustache). In contrast to previous methods, in which such local changes have been implemented by generating new (global) images, we propose to formulate local attribute transfer as an inpainting problem. By removing and regenerating only parts of the image, our Attribute Transfer Inpainting Generative Adversarial Network (ATI-GAN) is able to utilize local context information to focus on the attributes while keeping the background unmodified, resulting in visually sound results.
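To make the inpainting formulation concrete, the sketch below shows how a local attribute edit could be set up as a masked-region regeneration problem in PyTorch. The generator architecture, the mask coordinates, and the attribute encoding are illustrative assumptions chosen for exposition only; they are not the actual ATI-GAN network described in this paper.

```python
# Minimal sketch (assumed architecture, not ATI-GAN): local attribute transfer
# framed as inpainting. The region to edit is masked out and regenerated by a
# generator conditioned on a target attribute; the background is left untouched.
import torch
import torch.nn as nn

class InpaintingGenerator(nn.Module):
    """Toy encoder-decoder that regenerates a masked face region,
    conditioned on a target attribute vector (e.g. 'no mustache')."""
    def __init__(self, num_attrs: int = 1, ch: int = 64):
        super().__init__()
        # Input: RGB image with the target region zeroed out, the binary mask,
        # and the attribute vector broadcast to spatial maps.
        in_ch = 3 + 1 + num_attrs
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, ch, 4, stride=2, padding=1), nn.ReLU(True),
            nn.Conv2d(ch, ch * 2, 4, stride=2, padding=1), nn.ReLU(True),
            nn.ConvTranspose2d(ch * 2, ch, 4, stride=2, padding=1), nn.ReLU(True),
            nn.ConvTranspose2d(ch, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, img, mask, attrs):
        # mask: 1 inside the region to regenerate, 0 elsewhere.
        masked = img * (1.0 - mask)
        attr_maps = attrs[:, :, None, None].expand(-1, -1, img.size(2), img.size(3))
        out = self.net(torch.cat([masked, mask, attr_maps], dim=1))
        # Composite: keep the unmodified background, paste in the generated region.
        return masked + out * mask


# Usage: regenerate a hypothetical mustache region with the attribute set to "absent".
gen = InpaintingGenerator(num_attrs=1)
img = torch.rand(1, 3, 128, 128) * 2 - 1      # placeholder face image in [-1, 1]
mask = torch.zeros(1, 1, 128, 128)
mask[:, :, 80:110, 40:90] = 1.0               # illustrative mustache region
target_attr = torch.zeros(1, 1)               # assumed coding: 0 = attribute absent
edited = gen(img, mask, target_attr)          # background pixels are passed through
print(edited.shape)                           # torch.Size([1, 3, 128, 128])
```

The final compositing step is what distinguishes the inpainting view from global image generation: only the masked region is synthesized, so all background pixels are copied verbatim from the input, which is the property the method exploits to keep results visually consistent.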