Deep learning based reconstruction methods deliver outstanding results for solving inverse problems and are therefore becoming increasingly important. A recently introduced class of learning-based reconstruction methods is NETT (network Tikhonov regularization), which uses a trained neural network as the regularizer in generalized Tikhonov regularization. The existing analysis of NETT considers a fixed forward operator and a fixed regularizer, and studies convergence as the noise level in the data approaches zero. In this paper, we extend the framework and analysis considerably to reflect various practical aspects, taking into account the discretization of the data space, the solution space, the forward operator, and the neural network defining the regularizer. We show the asymptotic convergence of the discretized NETT approach for decreasing noise levels and discretization errors. Additionally, we derive convergence rates and present numerical results for a limited data problem in photoacoustic tomography.
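To illustrate the generalized Tikhonov setup underlying NETT, the following is a minimal toy sketch: minimizing a functional of the form 0.5·||Ax − y||² + α·R(x) by gradient descent. The forward operator A is a random matrix, and R is a simple quadratic stand-in for the trained network regularizer; the function name, step size, and all parameter values are illustrative assumptions, not from the paper.

```python
import numpy as np

def nett_style_reconstruction(A, y, alpha=0.1, lr=0.01, n_iter=500):
    """Toy gradient descent on 0.5*||A x - y||^2 + alpha * R(x).

    Here R(x) = 0.5*||x||^2 is a smooth stand-in for a trained
    neural-network regularizer (hypothetical illustration only).
    """
    _, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iter):
        data_grad = A.T @ (A @ x - y)  # gradient of the data-fidelity term
        reg_grad = x                   # gradient of the stand-in regularizer
        x -= lr * (data_grad + alpha * reg_grad)
    return x

# Small synthetic example: recover x_true from noiseless data y = A x_true.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = rng.standard_normal(10)
y = A @ x_true
x_rec = nett_style_reconstruction(A, y)
print("residual:", np.linalg.norm(A @ x_rec - y))
```

In the actual NETT framework the quadratic stand-in is replaced by a learned functional (e.g. a norm of trained-network features), and the paper's discretized analysis concerns how such reconstructions behave as both the noise level and the discretization errors tend to zero.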