Normalizing flows are a popular class of models for approximating probability distributions. However, their invertible nature limits their ability to model target distributions with a complex topological structure, such as Boltzmann distributions. Several procedures have been proposed to solve this problem, but many of them sacrifice invertibility and, thereby, the tractability of the log-likelihood as well as other desirable properties. To address these limitations, we introduce a base distribution for normalizing flows based on learned rejection sampling, allowing the resulting normalizing flow to model distributions with complex topologies without giving up bijectivity. Furthermore, we develop suitable learning algorithms based on both maximizing the log-likelihood and minimizing the reverse Kullback-Leibler divergence, and apply them to various sample problems, i.e.\ approximating 2D densities, density estimation on tabular data, image generation, and modeling Boltzmann distributions. In these experiments, our method is competitive with or outperforms the baselines.
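To make the idea concrete, below is a minimal sketch of a base distribution with a learned acceptance function, in the spirit of the method summarized above. It assumes a standard Gaussian proposal, a small MLP acceptance network, and a Monte Carlo estimate of the normalizer; the class and parameter names (`RejectionSampledBase`, `max_trials`, `n_mc`) are illustrative choices, not the paper's reference implementation.

```python
import torch
import torch.nn as nn

class RejectionSampledBase(nn.Module):
    """Illustrative base distribution with a learned acceptance function a(z).
    Density (ignoring the truncation correction): q(z) = a(z) * pi(z) / Z,
    with Z = E_pi[a(z)] and pi a standard Gaussian proposal. This is a sketch
    under assumed names and architecture, not the authors' implementation."""

    def __init__(self, dim, hidden=64, max_trials=100):
        super().__init__()
        self.dim = dim
        self.max_trials = max_trials
        # Small MLP mapping z to an acceptance probability in (0, 1).
        self.accept = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def log_prob(self, z, n_mc=1024):
        # log q(z) = log pi(z) + log a(z) - log Z,
        # with Z estimated by Monte Carlo over proposal samples.
        pi = torch.distributions.Normal(0.0, 1.0)
        log_pi = pi.log_prob(z).sum(-1)
        log_a = torch.log(self.accept(z).squeeze(-1) + 1e-12)
        z_mc = torch.randn(n_mc, self.dim)
        log_Z = torch.log(self.accept(z_mc).mean() + 1e-12)
        return log_pi + log_a - log_Z

    def sample(self, n):
        # Rejection sampling: propose from pi, accept with probability a(z);
        # after max_trials, fall back to accepting the last proposal.
        out = []
        for _ in range(n):
            for _ in range(self.max_trials):
                z = torch.randn(1, self.dim)
                if torch.rand(1) < self.accept(z):
                    break
            out.append(z)
        return torch.cat(out, 0)
```

Since both sampling and an (estimated) log-density are available, such a base can be composed with any bijective flow and trained with either of the objectives mentioned above.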