Normalizing flows have shown great success as general-purpose density estimators. However, many real-world applications require the use of domain-specific knowledge, which normalizing flows cannot readily incorporate. We propose embedded-model flows (EMF), which alternate general-purpose transformations with structured layers that embed domain-specific inductive biases. These layers are automatically constructed by converting user-specified differentiable probabilistic models into equivalent bijective transformations. We also introduce gated structured layers, which allow bypassing the parts of the model that fail to capture the statistics of the data. We demonstrate that EMFs can be used to induce desirable properties such as multimodality, hierarchical coupling, and continuity. Furthermore, we show that EMFs enable a high-performance form of variational inference in which the structure of the prior model is embedded in the variational architecture. In our experiments, we show that this approach outperforms state-of-the-art methods on common structured inference problems.
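To make the two central ideas concrete, the following is a minimal sketch (not the paper's implementation; all function names and the toy model are hypothetical). It shows (a) a structured layer obtained by reparameterizing a simple hierarchical Gaussian model as a bijection on standard-normal noise, and (b) a gated layer that interpolates between that structured bijection and the identity, so the flow can bypass a misspecified model component as the gate approaches zero.

```python
import numpy as np

def gaussian_hierarchy_bijector(z):
    # Structured layer derived from a toy hierarchical model:
    #   mu ~ N(0, 1);  x_i ~ N(mu, 1)
    # Mapping standard-normal noise z through this reparameterization
    # embeds the model's hierarchical coupling as a bijection.
    mu = z[0]
    x = mu + z[1:]  # children share the parent's shift
    return np.concatenate([[mu], x])

def gated_layer(z, f, gate_logit):
    # Gated structured layer (a convex combination, assumed form):
    # gate -> 1 applies the structured bijection f; gate -> 0
    # recovers the identity, bypassing a mismatched model part.
    gate = 1.0 / (1.0 + np.exp(-gate_logit))  # sigmoid in (0, 1)
    return gate * f(z) + (1.0 - gate) * z
```

For this affine example the gated map stays invertible for any gate value, since its Jacobian is unit lower-triangular; in general, gating must be designed so that bijectivity is preserved.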