We consider the efficient numerical minimization of Tikhonov functionals with nonlinear operators and nonsmooth, nonconvex penalty terms, which arise, for example, in variational regularization. To this end, we introduce a new class of SCD semismooth$^*$ Newton methods, which are based on a novel concept of graphical derivatives and exhibit locally superlinear convergence. We give a detailed description of these methods and provide explicit algorithms for sparsity and total-variation penalty terms. Their numerical performance is then illustrated on a number of tomographic imaging problems.