We introduce a new paradigm for generative modeling built on Continuous Normalizing Flows (CNFs), allowing us to train CNFs at unprecedented scale. Specifically, we present the notion of Flow Matching (FM), a simulation-free approach for training CNFs based on regressing vector fields of fixed conditional probability paths. Flow Matching is compatible with a general family of Gaussian probability paths for transforming between noise and data samples -- which subsumes existing diffusion paths as specific instances. Interestingly, we find that employing FM with diffusion paths results in a more robust and stable alternative for training diffusion models. Furthermore, Flow Matching opens the door to training CNFs with other, non-diffusion probability paths. An instance of particular interest is using Optimal Transport (OT) displacement interpolation to define the conditional probability paths. These paths are more efficient than diffusion paths, provide faster training and sampling, and result in better generalization. Training CNFs using Flow Matching on ImageNet leads to state-of-the-art performance in terms of both likelihood and sample quality, and allows fast and reliable sample generation using off-the-shelf numerical ODE solvers.
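The simulation-free objective described above — regressing a vector field onto the target velocity of a fixed conditional path — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes the simplest OT displacement path with σ_min = 0 (a straight line between noise and data, whose conditional velocity is the constant x₁ − x₀), and the `model` here is a toy linear stand-in for a neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

def ot_path_sample(x0, x1, t):
    """OT displacement interpolation (sigma_min = 0 for simplicity):
    a straight-line path x_t = (1 - t) x0 + t x1 between noise and data."""
    t = t[:, None]
    xt = (1.0 - t) * x0 + t * x1
    target = x1 - x0  # constant velocity along the straight path
    return xt, target

def flow_matching_loss(model, x1):
    """Simulation-free FM objective: regress the model's vector field
    onto the conditional target velocity. No ODE is simulated in training."""
    n, d = x1.shape
    x0 = rng.standard_normal((n, d))  # noise sample x0 ~ N(0, I)
    t = rng.uniform(size=n)           # time t ~ U[0, 1]
    xt, target = ot_path_sample(x0, x1, t)
    pred = model(xt, t)
    return np.mean((pred - target) ** 2)

# Toy stand-in "model" (illustrative only): a fixed linear map of x_t.
W = rng.standard_normal((2, 2)) * 0.1
model = lambda xt, t: xt @ W

x1 = rng.standard_normal((256, 2)) + 3.0  # toy "data" samples
loss = flow_matching_loss(model, x1)
```

Because the regression target is available in closed form for each (x₀, x₁, t) triple, training never integrates the ODE; sampling afterwards simply solves dx/dt = v_θ(x, t) with any off-the-shelf ODE solver.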