Continuous Normalizing Flows (CNFs) are a class of generative models that transform a prior distribution into a model distribution by solving an ordinary differential equation (ODE). We propose to train CNFs on manifolds by minimizing probability path divergence (PPD), a novel family of divergences between the probability density path generated by the CNF and a target probability density path. PPD is formulated using a logarithmic mass conservation formula, which is a linear first-order partial differential equation relating the log target probabilities and the CNF's defining vector field. PPD has several key benefits over existing methods: it sidesteps the need to solve an ODE per iteration, readily applies to manifold data, scales to high dimensions, and is compatible with a large family of target paths interpolating pure noise and data in finite time. Theoretically, PPD is shown to bound classical probability divergences. Empirically, we show that CNFs learned by minimizing PPD achieve state-of-the-art results in likelihoods and sample quality on existing low-dimensional manifold benchmarks, and provide the first example of a generative model scaling to moderately high dimensional manifolds.
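To make the objective concrete, the sketch below illustrates the idea in a Euclidean setting: the continuity equation in log form, ∂ₜ log qₜ(x) + ⟨∇ log qₜ(x), uₜ(x)⟩ + div uₜ(x) = 0, is turned into a pointwise residual whose squared norm is minimized over sampled times and points, with no ODE solve per iteration. The Gaussian target path `log_q`, the function names, and the L2 weighting are illustrative assumptions for this sketch, not the paper's definitions.

```python
# Minimal JAX sketch of a PPD-style loss in Euclidean space (assumed setup;
# names and the specific target path are hypothetical, not the paper's API).
import jax
import jax.numpy as jnp

def log_q(t, x, x1, sigma=0.1):
    # Hypothetical Gaussian target log-density path: pure noise at t=0,
    # concentrating around a data point x1 (with scale sigma) at t=1.
    mean = t * x1
    std = 1.0 - (1.0 - sigma) * t
    d = x.shape[-1]
    return (-0.5 * jnp.sum((x - mean) ** 2) / std**2
            - d * jnp.log(std) - 0.5 * d * jnp.log(2 * jnp.pi))

def pde_residual(params, vector_field, t, x, x1):
    # Logarithmic mass conservation residual:
    #   d/dt log q_t(x) + <grad_x log q_t(x), u(t, x)> + div_x u(t, x)
    # which is zero exactly when u transports the target path q_t.
    u = lambda y: vector_field(params, t, y)
    dlogq_dt = jax.grad(log_q, argnums=0)(t, x, x1)
    grad_logq = jax.grad(log_q, argnums=1)(t, x, x1)
    div_u = jnp.trace(jax.jacfwd(u)(x))  # exact divergence via autodiff
    return dlogq_dt + grad_logq @ u(x) + div_u

def ppd_loss(params, vector_field, ts, xs, x1s):
    # Monte Carlo estimate of an L2-type divergence between paths:
    # average squared PDE residual over sampled (t, x, x1) triples.
    res = jax.vmap(
        lambda t, x, x1: pde_residual(params, vector_field, t, x, x1)
    )(ts, xs, x1s)
    return jnp.mean(res ** 2)
```

Here `vector_field` is any parametric map (t, x) ↦ ℝᵈ, e.g. a small MLP; on a manifold the gradient, divergence, and inner product would instead be taken with respect to the Riemannian metric, which this Euclidean sketch omits.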