We present new convergence analyses for subspace correction methods applied to semicoercive and nearly semicoercive convex optimization problems, extending the theoretical framework developed for singular and nearly singular linear problems to the nonlinear setting. For semicoercive problems, we show that the convergence rate can be estimated in terms of a decomposition over the subspaces and the kernel of the problem that is stable with respect to the seminorm, in line with the theory for singular linear problems. For nearly semicoercive problems, we establish a parameter-independent convergence rate under the assumption that the kernel of the semicoercive part decomposes into a sum of local kernels, in line with the theory for nearly singular problems. To demonstrate the applicability of these results, we analyze the convergence of two-level additive Schwarz methods for a nonlinear Neumann boundary value problem and its perturbation within the proposed abstract framework.
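To indicate what a seminorm-stable decomposition of this kind typically looks like, the following is a minimal sketch in generic notation; the space $V$, kernel $N$, subspaces $V_1,\dots,V_J$, seminorm $|\cdot|$, and stability constant $C_0$ are illustrative placeholders and need not match the paper's own notation. One assumes that every $v \in V$ admits a decomposition
\[
  v = v_0 + \sum_{k=1}^{J} v_k, \qquad v_0 \in N, \; v_k \in V_k,
  \qquad \text{with} \qquad
  \sum_{k=1}^{J} |v_k|^2 \le C_0^2 \, |v|^2,
\]
and the convergence rate of the subspace correction method is then bounded in terms of the stability constant $C_0$, which is insensitive to the seminorm's kernel because the component $v_0$ carries no seminorm energy.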