Continuous normalizing flows (CNFs) are an attractive generative modeling technique, but they have thus far been held back by limitations in their simulation-based maximum likelihood training. In this paper, we introduce a new technique called conditional flow matching (CFM), a simulation-free training objective for CNFs. CFM features a stable regression objective like that used to train the stochastic flow in diffusion models but enjoys the efficient inference of deterministic flow models. In contrast to both diffusion models and prior CNF training algorithms, our CFM objective does not require the source distribution to be Gaussian or require evaluation of its density. Based on this new objective, we also introduce optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference, as evaluated in our experiments. Training CNFs with CFM improves results on a variety of conditional and unconditional generation tasks such as inferring single cell dynamics, unsupervised image translation, and Schr\"odinger bridge inference. Code is available at https://github.com/atong01/conditional-flow-matching .
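To make the abstract's description of a "simulation-free regression objective" concrete, below is a minimal, self-contained sketch of the basic (independent-coupling) CFM training loop in PyTorch. It is not the authors' reference implementation; the network, the `cfm_loss` helper, the noise level `sigma`, and the toy Gaussian source/target data are illustrative assumptions. OT-CFM would additionally pair the source and target minibatch samples via an optimal transport plan before interpolating.

```python
# Minimal CFM sketch (illustrative, not the paper's reference code):
# regress a velocity field v_theta(t, x_t) onto the conditional target
# velocity x1 - x0 along straight-line paths between paired samples.
import torch
import torch.nn as nn

class VelocityField(nn.Module):
    """Small MLP v_theta(t, x) approximating the time-dependent velocity field."""
    def __init__(self, dim: int = 2, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, t: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([t, x], dim=-1))

def cfm_loss(model: nn.Module, x0: torch.Tensor, x1: torch.Tensor,
             sigma: float = 0.1) -> torch.Tensor:
    """Independent-coupling CFM loss: x_t is a (slightly noised) point on the
    straight path from x0 to x1; the regression target is x1 - x0."""
    t = torch.rand(x0.shape[0], 1)                       # t ~ U(0, 1)
    xt = (1 - t) * x0 + t * x1 + sigma * torch.randn_like(x0)
    ut = x1 - x0                                         # conditional target velocity
    return ((model(t, xt) - ut) ** 2).mean()

# Toy usage: arbitrary source = standard Gaussian, target = shifted Gaussian.
# Note: neither distribution needs a tractable density, only samples.
model = VelocityField(dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(1000):
    x0 = torch.randn(256, 2)            # samples from the source distribution
    x1 = torch.randn(256, 2) + 4.0      # samples from the target distribution
    loss = cfm_loss(model, x0, x1)
    opt.zero_grad(); loss.backward(); opt.step()
```

After training, samples are generated by integrating the learned ODE dx/dt = v_theta(t, x) from source samples at t=0 to t=1 with any off-the-shelf ODE solver; no simulation is needed during training itself.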