In this work, we present and study Continuous Generative Neural Networks (CGNNs), namely, generative models in the continuous setting. The architecture is inspired by DCGAN, with one fully connected layer, several convolutional layers, and nonlinear activation functions. In the continuous $L^2$ setting, the dimensions of the spaces of each layer are replaced by the scales of a multiresolution analysis of a compactly supported wavelet. We present conditions on the convolutional filters and on the nonlinearity that guarantee that a CGNN is injective. This theory has applications to inverse problems, and allows us to derive Lipschitz stability estimates for (possibly nonlinear) infinite-dimensional inverse problems with unknowns belonging to the manifold generated by a CGNN. Several numerical simulations, including image deblurring, illustrate and validate this approach.
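For intuition, the following is a minimal NumPy sketch of the discrete, DCGAN-style architecture that the continuous model generalizes: one fully connected layer followed by convolutional layers with a nonlinearity. All names, shapes, and filter sizes are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def conv1d(x, w):
    # 1-D convolution keeping the input length ('same' mode)
    return np.convolve(x, w, mode="same")

def generator(z, W, filters, sigma=np.tanh):
    # DCGAN-style generator: fully connected layer, then
    # convolutional layers each followed by a nonlinearity.
    x = W @ z                       # fully connected layer
    for w in filters:
        x = sigma(conv1d(x, w))    # convolution + nonlinearity
    return x

rng = np.random.default_rng(0)
z = rng.standard_normal(8)           # latent code (illustrative dimension)
W = rng.standard_normal((64, 8))     # fully connected weights
filters = [rng.standard_normal(5) for _ in range(3)]
x = generator(z, W, filters)         # generated signal of length 64
```

In the continuous setting, the finite dimensions (8, 64) above are replaced by the scales of a wavelet multiresolution analysis, and the injectivity of the map $z \mapsto G(z)$ is what the stated conditions on the filters and the nonlinearity guarantee.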