Regularization is a long-standing challenge for ill-posed linear inverse problems, a prototype being the Fredholm integral equation of the first kind with additive Gaussian measurement noise. We introduce a new RKHS regularization that is adaptive to the measurement data and the underlying linear operator. This RKHS arises naturally in a variational approach, and its closure is the function space in which the true solution can be identified. We also introduce a small-noise analysis that compares regularization norms through their sharp convergence rates in the small-noise limit. Our analysis shows that the RKHS- and $L^2$-regularizers yield the same convergence rate when their optimal hyper-parameters are selected using the true solution, with the RKHS-regularizer attaining a smaller multiplicative constant. In computational practice, however, the RKHS-regularizer significantly outperforms the $L^2$- and $l^2$-regularizers in producing consistently convergent estimators as the noise level decays or the observation mesh is refined.
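To make the comparison concrete, the following minimal sketch discretizes a Fredholm integral equation of the first kind, adds Gaussian noise, and contrasts the standard $L^2$ (Tikhonov) filter with an operator-adapted RKHS-type filter. The kernel, mesh, noise level, and in particular the choice of RKHS norm (here the norm induced by the pseudoinverse of $A^\top A$) are illustrative assumptions, not the paper's exact data-adaptive construction; the sketch only shows the general mechanism of swapping the regularization norm.

```python
# Minimal sketch (assumed setup, not the paper's exact construction):
# compare the L2-Tikhonov filter with an operator-adapted RKHS-type
# filter on a discretized Fredholm equation of the first kind.
import numpy as np

rng = np.random.default_rng(0)

# Discretize (Au)(s) = \int_0^1 k(s, t) u(t) dt on a uniform mesh.
n = 200
t = np.linspace(0.0, 1.0, n)
h = t[1] - t[0]
kernel = lambda s, tt: np.exp(-np.abs(s - tt))   # illustrative smooth kernel
A = kernel(t[:, None], t[None, :]) * h           # quadrature-weighted operator

u_true = np.sin(np.pi * t) + 0.5 * np.sin(3 * np.pi * t)
sigma = 1e-3                                     # assumed noise level
b = A @ u_true + sigma * rng.standard_normal(n)  # noisy measurements

U, s, Vt = np.linalg.svd(A)
c = U.T @ b                                      # data in the left-singular basis

def tikhonov_l2(lam):
    # Minimizes ||Au - b||^2 + lam * ||u||_{L^2}^2,
    # i.e. the spectral filter s_i / (s_i^2 + lam).
    return Vt.T @ (s / (s**2 + lam) * c)

def tikhonov_rkhs(lam):
    # Illustrative operator-adapted norm ||u||^2 = u^T (A^T A)^† u,
    # which gives the spectral filter s_i^3 / (s_i^4 + lam).
    # The paper's data-adaptive RKHS norm differs in its details.
    return Vt.T @ (s**3 / (s**4 + lam) * c)

for lam in [1e-8, 1e-6, 1e-4]:
    e2 = np.linalg.norm(tikhonov_l2(lam) - u_true) * np.sqrt(h)
    eH = np.linalg.norm(tikhonov_rkhs(lam) - u_true) * np.sqrt(h)
    print(f"lam={lam:.0e}  L2-reg error={e2:.3e}  RKHS-reg error={eH:.3e}")
```

Sweeping over noise levels `sigma` instead of `lam` would mimic the small-noise limit studied in the paper, where the two norms are compared by the rate at which their best-tuned errors decay.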