We introduce Projected Latent Markov Chain Monte Carlo (PL-MCMC), a technique for sampling from the high-dimensional conditional distributions learned by a normalizing flow. We prove that a Metropolis-Hastings implementation of PL-MCMC asymptotically samples from the exact conditional distributions associated with a normalizing flow. As a conditional sampling method, PL-MCMC enables Monte Carlo Expectation Maximization (MC-EM) training of normalizing flows from incomplete data. Through experimental tests applying normalizing flows to missing data tasks for a variety of data sets, we demonstrate the efficacy of PL-MCMC for conditional sampling from normalizing flows.
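To make the conditional-sampling idea concrete, here is a minimal, hypothetical sketch of a Metropolis-Hastings sampler that targets a model density over the missing entries of a partially observed vector while holding the observed entries fixed. It illustrates the general mechanism in the spirit of PL-MCMC, not the paper's exact projected-latent proposal; the `log_density` stand-in (a standard Gaussian) would be replaced by a trained normalizing flow's log-density.

```python
import numpy as np

def log_density(x):
    # Stand-in for a normalizing flow's log p(x); here a standard Gaussian.
    return -0.5 * np.sum(x ** 2)

def conditional_mh(x_obs, obs_mask, n_steps=2000, step=0.5, seed=0):
    """Metropolis-Hastings over the missing entries of x, with the
    observed entries (obs_mask == True) held fixed throughout."""
    rng = np.random.default_rng(seed)
    x = np.where(obs_mask, x_obs, 0.0)  # initialize missing entries at 0
    logp = log_density(x)
    for _ in range(n_steps):
        prop = x.copy()
        # Random-walk proposal on the missing coordinates only.
        prop[~obs_mask] += step * rng.standard_normal((~obs_mask).sum())
        logp_prop = log_density(prop)
        # Symmetric proposal, so the MH ratio is just the density ratio.
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = prop, logp_prop
    return x
```

Because the observed coordinates are never perturbed, the chain's stationary distribution is the model's conditional distribution over the missing coordinates given the observed ones; PL-MCMC performs the analogous construction in the flow's latent space.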