We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting random weighted DNN generators. This result extends the work of Hand and Voroninski from efficient inversion to efficient posterior sampling. In practice, to allow for increased expressivity, we propose performing posterior sampling in the latent space of a pre-trained generative model. To this end, we train a score-based model in the latent space of a StyleGAN-2 and use it to solve inverse problems. Our framework, Score-Guided Intermediate Layer Optimization (SGILO), extends prior work by replacing the sparsity regularization with a generative prior in the intermediate layer. Experimentally, we obtain significant improvements over the previous state-of-the-art, especially in the low-measurement regime.
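As a point of reference for the sampler analyzed above, the following is a minimal sketch of the unadjusted Langevin Algorithm, whose stationary distribution (up to discretization error) is the target posterior. The function names and the toy Gaussian target are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def unadjusted_langevin(score, z0, step=1e-2, n_steps=1000, rng=None):
    """Unadjusted Langevin Algorithm (illustrative sketch).

    Iterates z_{k+1} = z_k + step * score(z_k) + sqrt(2 * step) * noise,
    where score(z) approximates grad log p(z) for the target density p.
    """
    rng = np.random.default_rng(rng)
    z = np.array(z0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        z = z + step * score(z) + np.sqrt(2 * step) * noise
    return z

# Toy check: a standard Gaussian has score(z) = -z, so long chains
# should produce samples with mean ~0 and variance ~1.
samples = np.stack([
    unadjusted_langevin(lambda z: -z, np.zeros(2), rng=seed)
    for seed in range(200)
])
```

In SGILO, the score function would instead come from the learned score-based model in the generator's intermediate latent space, rather than a closed-form expression as in this toy target.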