Normalizing flows are bijective mappings between inputs and latent representations with a fully factorized distribution. They are attractive because they offer exact likelihood evaluation and efficient sampling. However, their effective capacity is often insufficient, since the bijectivity constraint limits the model width. We address this issue by incrementally padding intermediate representations with noise. We precondition the noise in accordance with previous invertible units, which we describe as cross-unit coupling. Our invertible Glow-like modules express intra-unit affine coupling as a fusion of a densely connected block and Nystr\"om self-attention. We refer to our architecture as DenseFlow since both cross-unit and intra-unit couplings rely on dense connectivity. Experiments show significant improvements due to the proposed contributions and reveal state-of-the-art density estimation among all generative models under moderate computing budgets.
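To make the cross-unit coupling concrete, the following is a minimal sketch in a PyTorch setting, not the authors' implementation: the module name `CrossUnitCoupling`, the single-convolution context network, and the channel sizes are illustrative assumptions. The idea shown is that the input is padded with noise whose mean and scale are predicted from features of previously processed invertible units, and the term $-\log q(e \mid h)$ is returned because noise augmentation turns the exact likelihood into a variational lower bound.

```python
# A minimal sketch (assumed names and shapes) of cross-unit coupling:
# intermediate representations are padded with noise whose distribution
# is preconditioned on features from previous invertible units.

import torch
import torch.nn as nn


class CrossUnitCoupling(nn.Module):
    """Pads the input with noise e ~ N(mu(h), sigma(h)^2), where h collects
    features produced by previously processed invertible units."""

    def __init__(self, pad_channels, context_channels):
        super().__init__()
        # Small conv net that preconditions the noise on the context h
        # (a stand-in for the densely connected block used in the paper).
        self.context_net = nn.Conv2d(
            context_channels, 2 * pad_channels, kernel_size=3, padding=1
        )

    def forward(self, x, h):
        # Predict mean and log-std of the padding noise from the context.
        mu, log_sigma = self.context_net(h).chunk(2, dim=1)
        eps = torch.randn_like(mu)
        e = mu + eps * log_sigma.exp()

        # -log q(e | h): enters the variational lower bound on log p(x),
        # since noise augmentation makes the exact likelihood intractable.
        log_q = (
            -0.5 * eps.pow(2) - log_sigma
            - 0.5 * torch.log(torch.tensor(2 * torch.pi))
        ).flatten(1).sum(1)

        x_aug = torch.cat([x, e], dim=1)  # widened representation
        return x_aug, -log_q


# Usage: widen a 3-channel image batch by one noise channel, conditioned on h.
if __name__ == "__main__":
    x = torch.randn(4, 3, 32, 32)
    h = torch.randn(4, 8, 32, 32)  # features from previous invertible units
    coupling = CrossUnitCoupling(pad_channels=1, context_channels=8)
    x_aug, neg_log_q = coupling(x, h)
    print(x_aug.shape, neg_log_q.shape)  # (4, 4, 32, 32) and (4,)
```

The widened tensor `x_aug` would then be processed by the next Glow-like unit, whose intra-unit affine coupling the paper implements as a fusion of a densely connected block and Nystr\"om self-attention; that part is omitted from this sketch.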