We study a normalizing flow in the latent space of a top-down generator model, in which the normalizing flow serves as an informative prior for the generator. We propose to jointly learn the latent space normalizing flow prior and the top-down generator by a Markov chain Monte Carlo (MCMC)-based maximum likelihood algorithm, in which a short-run Langevin sampling from the intractable posterior distribution is performed to infer the latent variables for each observed example, so that the parameters of the normalizing flow prior and the generator can be updated with the inferred latent variables. We show that, under this non-convergent short-run MCMC scenario, the finite-step Langevin dynamics is a flow-like approximate inference model, and the learning objective is in fact a perturbation of maximum likelihood estimation (MLE). We further point out that the learning framework seeks to (i) match the latent space normalizing flow to the aggregated posterior produced by the short-run Langevin flow, and (ii) bias the model away from MLE so that the short-run Langevin flow inference is close to the true posterior. Empirical results from extensive experiments validate the effectiveness of the proposed latent space normalizing flow model on the tasks of image generation, image reconstruction, anomaly detection, supervised image inpainting, and unsupervised image recovery.
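To make the inference step concrete, the following is a minimal sketch of K-step (non-convergent) short-run Langevin sampling in the latent space, followed by a parameter update with the inferred latent variables. It uses a toy linear generator and a standard Gaussian stand-in for the normalizing flow prior; all names, dimensions, and step sizes are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: linear generator x = W z + noise. A standard Gaussian prior
# stands in for the normalizing flow prior (a simplifying assumption).
d_z, d_x = 2, 4
W_true = rng.normal(size=(d_x, d_z))
x = W_true @ rng.normal(size=d_z)   # one observed example

W = rng.normal(size=(d_x, d_z))     # generator parameters to be learned
sigma = 0.3                         # observation noise standard deviation

def grad_log_posterior(z):
    # grad_z [ log p(x|z) + log p(z) ] for a Gaussian likelihood
    # and an N(0, I) prior.
    return W.T @ (x - W @ z) / sigma**2 - z

def short_run_langevin(z0, K=30, step=0.01):
    # Finite-step Langevin dynamics initialized from the prior sample z0;
    # with small K it is a flow-like approximate inference model rather
    # than a convergent MCMC chain.
    z = z0.copy()
    for _ in range(K):
        noise = rng.normal(size=z.shape)
        z = z + step * grad_log_posterior(z) + np.sqrt(2.0 * step) * noise
    return z

z_init = rng.normal(size=d_z)       # initialize from the prior
z_inf = short_run_langevin(z_init)  # inferred latent for this example

# The inferred latent then drives the learning step: here, one gradient
# ascent step on log p(x | z_inf) with respect to W. In the full model the
# normalizing flow prior's parameters would be updated analogously on
# log p(z_inf).
lr = 0.05
W = W + lr * np.outer(x - W @ z_inf, z_inf) / sigma**2
```

Alternating these two steps over a dataset yields the joint MCMC-based maximum likelihood learning described above, with the flow prior fitted to the aggregated posterior of the short-run Langevin samples.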