Energy-based models (EBMs) exhibit a variety of desirable properties in predictive tasks, such as generality, simplicity, and compositionality. However, training EBMs on high-dimensional datasets remains unstable and expensive. In this paper, we present a Manifold EBM (M-EBM) to boost the overall performance of unconditional EBMs and Joint Energy-based Models (JEMs). Despite its simplicity, M-EBM significantly improves the training stability and speed of unconditional EBMs on a host of benchmark datasets, such as CIFAR10, CIFAR100, CelebA-HQ, and ImageNet 32x32. When class labels are available, the label-incorporated M-EBM (M-JEM) further surpasses M-EBM in image generation quality, with more than a 40% FID improvement, while also achieving improved accuracy. The code can be found at https://github.com/sndnyang/mebm.