Score-based generative models (SGMs) have recently demonstrated impressive results in terms of both sample quality and distribution coverage. However, they are usually applied directly in data space and often require thousands of network evaluations for sampling. Here, we propose the Latent Score-based Generative Model (LSGM), a novel approach that trains SGMs in a latent space, relying on the variational autoencoder framework. Moving from data to latent space allows us to train more expressive generative models, apply SGMs to non-continuous data, and learn smoother SGMs in a smaller space, resulting in fewer network evaluations and faster sampling. To enable training LSGMs end-to-end in a scalable and stable manner, we (i) introduce a new score-matching objective suitable to the LSGM setting, (ii) propose a novel parameterization of the score function that allows SGM to focus on the mismatch of the target distribution with respect to a simple Normal one, and (iii) analytically derive multiple techniques for variance reduction of the training objective. LSGM obtains a state-of-the-art FID score of 2.10 on CIFAR-10, outperforming all existing generative results on this dataset. On CelebA-HQ-256, LSGM is on a par with previous SGMs in sample quality while outperforming them in sampling time by two orders of magnitude. In modeling binary images, LSGM achieves state-of-the-art likelihood on the binarized OMNIGLOT dataset. Our project page and code can be found at https://nvlabs.github.io/LSGM .
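To make point (ii) concrete, below is a minimal sketch of a "mixed" score parameterization in PyTorch. It assumes a variance-preserving diffusion whose marginals would be exactly N(0, I) if the encoder's aggregate posterior were standard Normal, so the analytic Normal score is simply -z_t and the network only models the residual mismatch. The class name `MixedScoreNet`, the small MLP, and the learnable mixing weight `alpha_logit` are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn


class MixedScoreNet(nn.Module):
    """Hypothetical sketch of a mixed score parameterization.

    The score of the diffused latent distribution is expressed as the analytic
    score of a standard Normal, -z_t, plus a learned residual, so the network
    only has to capture the mismatch between the latent distribution and N(0, I).
    """

    def __init__(self, latent_dim: int, hidden_dim: int = 256):
        super().__init__()
        # Small MLP standing in for the paper's U-Net score network (assumption).
        self.residual = nn.Sequential(
            nn.Linear(latent_dim + 1, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, latent_dim),
        )
        # Learnable mixing coefficient in [0, 1] (illustrative, not the paper's exact scheme).
        self.alpha_logit = nn.Parameter(torch.zeros(1))

    def forward(self, z_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        """Estimate the score of the diffused latents z_t at times t in [0, 1]."""
        alpha = torch.sigmoid(self.alpha_logit)
        normal_score = -z_t  # analytic score of N(0, I)
        nn_score = self.residual(torch.cat([z_t, t[:, None]], dim=-1))
        return (1.0 - alpha) * normal_score + alpha * nn_score


# Usage: score estimates for a batch of diffused latents at random times.
if __name__ == "__main__":
    model = MixedScoreNet(latent_dim=16)
    z_t = torch.randn(8, 16)
    t = torch.rand(8)
    print(model(z_t, t).shape)  # torch.Size([8, 16])
```

When the latent distribution is already close to a standard Normal, the mixing weight can stay small and sampling needs fewer, easier score evaluations, which is part of why latent-space SGMs sample faster than data-space ones.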