We present a novel generative modeling method called diffusion normalizing flow based on stochastic differential equations (SDEs). The algorithm consists of two neural SDEs: a forward SDE that gradually adds noise to transform the data into Gaussian random noise, and a backward SDE that gradually removes the noise to sample from the data distribution. By jointly training the two neural SDEs to minimize a common cost function that quantifies the difference between the two, the backward SDE converges to a diffusion process that starts with a Gaussian distribution and ends with the desired data distribution. Our method is closely related to normalizing flow and diffusion probabilistic models and can be viewed as a combination of the two. Compared with normalizing flow, diffusion normalizing flow is able to learn distributions with sharp boundaries. Compared with diffusion probabilistic models, diffusion normalizing flow requires fewer discretization steps and thus has better sampling efficiency. Our algorithm demonstrates competitive performance in both high-dimensional data density estimation and image generation tasks.
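To make the two-SDE structure concrete, the following is a minimal sketch (not the authors' implementation) of simulating a pair of neural SDEs with fixed-step Euler-Maruyama discretization, as the abstract implies. The drift networks `f_fwd` and `f_bwd`, the constant diffusion coefficient `g`, and the step count are all hypothetical placeholders; the joint training objective is omitted.

```python
import torch
import torch.nn as nn

class Drift(nn.Module):
    """Small MLP mapping (x, t) to a drift vector; a stand-in for the
    neural drifts of the forward and backward SDEs."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=1))

def euler_maruyama(x0, drift, g, n_steps=20):
    """Simulate dx = drift(x, t) dt + g(t) dw on t in [0, 1]
    with fixed-step Euler-Maruyama."""
    dt = 1.0 / n_steps
    x = x0
    for i in range(n_steps):
        t = torch.full((x.shape[0], 1), i * dt)
        x = x + drift(x, t) * dt + g(t) * dt ** 0.5 * torch.randn_like(x)
    return x

dim = 2
f_fwd, f_bwd = Drift(dim), Drift(dim)  # hypothetical drift networks
g = lambda t: 0.5                      # constant diffusion coefficient (an assumption)

data = torch.randn(128, dim)           # stand-in for training data
noise = euler_maruyama(data, f_fwd, g) # forward SDE: data -> Gaussian-like noise
# Backward SDE, simulated in its own (reversed) time direction:
samples = euler_maruyama(torch.randn(128, dim), f_bwd, g)  # noise -> samples
```

In this reading, sampling efficiency hinges on `n_steps`: each discretization step costs one drift-network evaluation, which is why fewer steps translate directly into faster sampling.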