Normalizing flows are a powerful tool for generative modelling, density estimation, and posterior reconstruction in Bayesian inverse problems. In this paper, we introduce proximal residual flows, a new architecture for normalizing flows. Based on the fact that proximal neural networks are by definition averaged operators, we ensure the invertibility of certain residual blocks. Moreover, we extend the architecture to conditional proximal residual flows for posterior reconstruction within Bayesian inverse problems. We demonstrate the performance of proximal residual flows on numerical examples.
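The following is a minimal sketch of the invertibility mechanism behind residual blocks, stated for a generic residual map with Lipschitz constant below one (the standard invertible-residual-network argument); the precise averagedness condition exploited for proximal residual flows is an assumption here and is specified in the body of the paper.

% Hedged sketch: invertibility of a residual block via the Banach fixed-point
% theorem, assuming the residual g is L-Lipschitz with L < 1 (illustrative
% condition, not necessarily the exact one used for proximal residual flows).
\[
  T(x) = x + g(x), \qquad \|g(x) - g(y)\| \le L\,\|x - y\| \quad \text{with } L < 1.
\]
% For fixed y, the map x \mapsto y - g(x) is a contraction, so the iteration
\[
  x_{k+1} = y - g(x_k)
\]
% converges to the unique x satisfying T(x) = y; hence T is invertible and its
% inverse can be evaluated by fixed-point iteration.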