We introduce Invertible Dense Networks (i-DenseNets), a more parameter-efficient extension of Residual Flows. The method relies on an analysis of the Lipschitz continuity of the concatenation in DenseNets, where we enforce invertibility of the network by satisfying the Lipschitz constraint. Furthermore, we propose a learnable weighted concatenation, which not only improves model performance but also indicates the importance of the concatenated weighted representations. Additionally, we introduce the Concatenated LipSwish as an activation function, for which we show how to enforce the Lipschitz condition and which boosts performance. The new architecture, i-DenseNet, outperforms Residual Flow and other flow-based models on density estimation evaluated in bits per dimension, using an equal parameter budget. Moreover, we show that the proposed model outperforms Residual Flows when trained as a hybrid model, where the model is both generative and discriminative.
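To make the activation concrete, here is a minimal NumPy sketch of the idea behind LipSwish and its concatenated variant. It assumes LipSwish is Swish scaled by 1/1.1 (which bounds its Lipschitz constant by 1, as in Residual Flows); the exact normalization the paper applies to keep the *concatenated* activation 1-Lipschitz is not reproduced here, so this is an illustrative sketch rather than the paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lipswish(x, beta=1.0):
    # Swish x * sigmoid(beta * x), scaled by 1/1.1 so that the
    # derivative (and hence the Lipschitz constant) stays below 1.
    return x * sigmoid(beta * x) / 1.1

def clipswish(x, beta=1.0):
    # Concatenated LipSwish (sketch): apply LipSwish to x and -x and
    # concatenate along the feature axis, doubling the channel count.
    # The paper additionally rescales this concatenation to enforce
    # the Lipschitz condition; that constant is omitted here.
    return np.concatenate([lipswish(x, beta), lipswish(-x, beta)], axis=-1)
```

Because both branches see the input (once negated), the concatenated activation retains more signal than a single saturating nonlinearity, which is what the abstract credits for the performance boost.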