Energy-Based Models (EBMs) have been known in the Machine Learning community for decades. Since the seminal works on EBMs dating back to the noughties, many efficient methods have appeared that solve the generative modelling problem by means of energy potentials (unnormalized likelihood functions). In contrast, the realm of Optimal Transport (OT), and neural OT solvers in particular, is much less explored and limited to a few recent works (excluding WGAN-based approaches, which use OT as a loss function and do not model OT maps themselves). In our work, we bridge the gap between EBMs and Entropy-regularized OT. We present a novel methodology that allows the recent developments and technical improvements of the former to enrich the latter. We validate the applicability of our method on toy 2D scenarios as well as on standard unpaired image-to-image translation problems. For the sake of simplicity, we choose simple short- and long-run EBMs as the backbone of our Energy-guided Entropic OT method, leaving the application of more sophisticated EBMs to future research.