Learning high-dimensional distributions is an important yet challenging problem in machine learning, with applications across many domains. In this paper, we introduce new techniques that formulate the problem as solving a Fokker-Planck equation in a lower-dimensional latent space, aiming to mitigate the difficulties that arise in the high-dimensional data space. Our proposed model consists of latent-distribution morphing, a generator, and a parameterized Fokker-Planck kernel function. One appealing property of our model is that it can be trained with an arbitrary number of latent-distribution morphing steps, or even without morphing, which makes it flexible and as efficient as Generative Adversarial Networks (GANs). This property also makes latent-distribution morphing an efficient plug-and-play scheme: it can be used to improve arbitrary GANs and, more interestingly, to effectively correct failure cases of GAN models. Extensive experiments illustrate the advantages of our proposed method over existing models.
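As a rough illustration of the idea, the sketch below shows latent codes being refined by Langevin-style updates (the discretized dynamics whose stationary law solves a Fokker-Planck equation) before being fed to a fixed generator. All components here are hypothetical toy stand-ins: the generator is a random linear map, and a standard normal prior plays the role of the target latent distribution; in the actual model the score would come from the learned, parameterized kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM, DATA_DIM = 8, 32

# Toy "generator": a fixed random linear map followed by tanh.
W_g = rng.standard_normal((LATENT_DIM, DATA_DIM)) * 0.1

def generator(z):
    return np.tanh(z @ W_g)

def latent_score(z):
    # Score (gradient of log-density) of a standard normal prior, grad log N(0, I) = -z.
    # This is an illustrative assumption standing in for the learned kernel.
    return -z

def morph_latents(z, n_steps=20, step_size=0.05):
    """Langevin morphing: z <- z + eps * score(z) + sqrt(2 * eps) * noise.

    The number of steps is arbitrary; n_steps=0 recovers plain GAN sampling.
    """
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        z = z + step_size * latent_score(z) + np.sqrt(2 * step_size) * noise
    return z

z0 = 5.0 + rng.standard_normal((4, LATENT_DIM))  # poorly placed latents
z_morphed = morph_latents(z0)                    # pulled back toward the prior
samples = generator(z_morphed)
print(samples.shape)                             # (4, 32)
```

Because morphing only touches the latent codes, the same loop can wrap any pretrained generator, which is the sense in which the scheme is plug-and-play.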