Whereas diverse variants of diffusion models exist, only a few works have investigated extending the linear diffusion into a nonlinear diffusion process. The effect of nonlinearity remains poorly understood, but intuitively, there should be diffusion patterns that train the generative distribution toward the data distribution more effectively. This paper introduces such a data-adaptive, nonlinear diffusion process for score-based diffusion models. The proposed Implicit Nonlinear Diffusion Model (INDM) learns a nonlinear diffusion process by combining a normalizing flow with a diffusion process. Specifically, INDM implicitly constructs a nonlinear diffusion on the \textit{data space} by leveraging a linear diffusion on the \textit{latent space} through a flow network. This flow network is the key to forming the nonlinear diffusion, as the nonlinearity depends entirely on the flow network. This flexible nonlinearity is what improves the learning curve of INDM to nearly Maximum Likelihood Estimation (MLE) training, in contrast to the non-MLE training of DDPM++, which turns out to be a special case of INDM with the identity flow. Moreover, training the nonlinear diffusion makes sampling robust to the discretization step sizes. In experiments, INDM achieves the state-of-the-art FID on CelebA.
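The core construction can be illustrated with a minimal sketch: an invertible map sends data to a latent space, a linear (VP-type) diffusion step is taken there, and the inverse map pulls the result back, inducing a diffusion on the data space. This is only a toy illustration, not the INDM implementation: the `AffineFlow` class is a hypothetical stand-in for the trained normalizing flow (a real flow would be nonlinear, so the induced data-space diffusion would be nonlinear), and `vp_sde_step` is a single Euler-Maruyama step of a generic linear VP-SDE.

```python
import numpy as np

class AffineFlow:
    """Toy invertible map z = a*x + b; a hypothetical stand-in for the
    normalizing flow. With the identity flow (a=1, b=0), the induced
    data-space diffusion reduces to the linear latent-space one."""
    def __init__(self, a=2.0, b=0.5):
        self.a, self.b = a, b

    def forward(self, x):   # data space -> latent space
        return self.a * x + self.b

    def inverse(self, z):   # latent space -> data space
        return (z - self.b) / self.a

def vp_sde_step(z, dt, beta=1.0, rng=None):
    """One Euler-Maruyama step of a linear VP-SDE in latent space:
    dz = -0.5 * beta * z dt + sqrt(beta) dW."""
    if rng is None:
        rng = np.random.default_rng(0)
    drift = -0.5 * beta * z * dt
    noise = np.sqrt(beta * dt) * rng.standard_normal(z.shape)
    return z + drift + noise

# Diffuse in latent space, then map back through the inverse flow:
# the composition defines the (implicitly constructed) data-space diffusion.
flow = AffineFlow()
x0 = np.array([1.0, -1.0])
z0 = flow.forward(x0)
z1 = vp_sde_step(z0, dt=0.01, rng=np.random.default_rng(42))
x1 = flow.inverse(z1)
```

Because the data-space process is never written down explicitly, only the flow and the linear latent diffusion need to be parameterized and trained, which is what makes the nonlinearity "implicit".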