We introduce a new paradigm for generative modeling built on Continuous Normalizing Flows (CNFs), allowing us to train CNFs at unprecedented scale. Specifically, we present the notion of Flow Matching (FM), a simulation-free approach for training CNFs based on regressing vector fields of fixed conditional probability paths. Flow Matching is compatible with a general family of Gaussian probability paths for transforming between noise and data samples -- which subsumes existing diffusion paths as specific instances. Interestingly, we find that employing FM with diffusion paths results in a more robust and stable alternative for training diffusion models. Furthermore, Flow Matching opens the door to training CNFs with other, non-diffusion probability paths. An instance of particular interest is using Optimal Transport (OT) displacement interpolation to define the conditional probability paths. These paths are more efficient than diffusion paths, provide faster training and sampling, and result in better generalization. Training CNFs using Flow Matching on ImageNet leads to consistently better performance than alternative diffusion-based methods in terms of both likelihood and sample quality, and allows fast and reliable sample generation using off-the-shelf numerical ODE solvers.
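To make the regression objective concrete, the following is a minimal sketch (not the authors' released code) of a Conditional Flow Matching loss using the OT displacement conditional path described above; `v_theta`, the `(x_t, t)` calling convention, and `sigma_min` are assumed interfaces and hyperparameters for illustration.

```python
import torch

def conditional_flow_matching_loss(v_theta, x1, sigma_min=1e-4):
    """Conditional Flow Matching loss with the OT (straight-line) conditional path.

    v_theta : network mapping (x_t, t) to a vector field with the same shape
              as x_t (hypothetical interface).
    x1      : batch of data samples.
    """
    batch = x1.shape[0]
    # Sample t ~ U[0, 1] and noise x0 ~ N(0, I).
    t = torch.rand(batch, device=x1.device)
    x0 = torch.randn_like(x1)
    t_b = t.view(batch, *([1] * (x1.dim() - 1)))

    # OT conditional path: x_t = (1 - (1 - sigma_min) t) x0 + t x1.
    x_t = (1 - (1 - sigma_min) * t_b) * x0 + t_b * x1
    # Corresponding conditional target velocity: u_t = x1 - (1 - sigma_min) x0.
    u_t = x1 - (1 - sigma_min) * x0

    # Regress the learned vector field onto the fixed conditional target.
    return ((v_theta(x_t, t) - u_t) ** 2).mean()
```

Sampling then amounts to integrating the learned vector field `v_theta` from t=0 (noise) to t=1 (data) with an off-the-shelf ODE solver.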