Due to the success of generative flows in modeling data distributions, they have been explored for inverse problems. Given a pre-trained generative flow, previous work proposed minimizing the 2-norm of the latent variables as a regularization term, the intuition being that high-likelihood latent variables produce the closest restoration. However, as we show in our experiments, high-likelihood latent variables may still generate unrealistic samples. We therefore propose a solver that directly produces high-likelihood reconstructions. We hypothesize that our approach could make generative flows a general-purpose solver for inverse problems. Furthermore, we propose 1×1 coupling functions to introduce permutations in a generative flow; they have the advantage that their inverse does not need to be computed during generation. Finally, we evaluate our method on denoising, deblurring, inpainting, and colorization, and observe a compelling improvement over prior work.
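The latent-2-norm regularization of prior work described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a toy invertible affine map stands in for a pretrained generative flow, a random linear operator stands in for the degradation, and all names (`f`, `A`, `lam`) are assumptions for this example. Prior work recovers a restoration x = f(z) from an observation y = A x by minimizing ||A f(z) - y||² + lam ||z||² over the latent z.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8

# Toy "pretrained flow": f(z) = W z + b, invertible by construction
# (W is diagonally dominant). A real flow would be a deep invertible net.
W = rng.standard_normal((d, d)) + 4.0 * np.eye(d)
b = rng.standard_normal(d)
f = lambda z: W @ z + b

# Toy degradation A (fewer measurements than signal entries) and observation y.
A = 0.5 * rng.standard_normal((4, d))
x_true = f(rng.standard_normal(d))
y = A @ x_true

lam, lr = 1e-3, 1e-3
z = np.zeros(d)                 # start from the latent mean (highest prior likelihood)
for _ in range(20000):          # plain gradient descent on the regularized objective
    r = A @ f(z) - y                        # data-fit residual
    z -= lr * (W.T @ (A.T @ r) + lam * z)   # grad of data fit + grad of lam*||z||^2

x_hat = f(z)
print(np.linalg.norm(A @ x_hat - y))        # data-fit error after optimization
```

The small 2-norm penalty keeps z near the high-density region of the Gaussian latent prior; the abstract's point is that this does not by itself guarantee a realistic reconstruction.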