Analyzing the acceleration behavior of preconditioned gradient-based eigensolvers is a substantial theoretical challenge. In this work, we present a novel framework for preconditioning on Riemannian manifolds and introduce a metric, the leading angle, for evaluating preconditioners for symmetric eigenvalue problems. Extending the locally optimal Riemannian accelerated gradient method from Riemannian convex optimization, we develop the Riemannian Acceleration with Preconditioning (RAP) method for symmetric eigenvalue problems and provide theoretical support for its acceleration. Our analysis of the Schwarz preconditioner for elliptic eigenvalue problems shows that RAP achieves a convergence rate of $1-C\kappa^{-1/2}$, where $\kappa$ denotes the condition number, improving on the preconditioned steepest descent method's rate of $1-C\kappa^{-1}$. The exponent $-1/2$ in $\kappa^{-1/2}$ is sharp, and numerical experiments confirm our theoretical findings.
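For a concrete picture of the class of methods being accelerated, the following is a minimal single-vector sketch of a locally optimal preconditioned gradient eigensolver (in the spirit of LOBPCG). It is an illustrative stand-in, not the paper's RAP method; the function name `lopcg`, the dense NumPy types, and the Jacobi preconditioner in the usage lines are assumptions made for the example.

```python
import numpy as np

def lopcg(A, T, x0, tol=1e-10, max_iter=500):
    """Single-vector locally optimal preconditioned iteration for the
    smallest eigenpair of a symmetric matrix A, with preconditioner
    T ~ A^{-1}.  A generic LOBPCG-style sketch, NOT the paper's RAP
    method; A, T, x0 are assumed to be dense NumPy arrays."""
    x = x0 / np.linalg.norm(x0)
    p = None                                  # previous search direction
    for _ in range(max_iter):
        rho = x @ (A @ x)                     # Rayleigh quotient (||x|| = 1)
        r = A @ x - rho * x                   # residual = Riemannian gradient
        if np.linalg.norm(r) < tol:
            break
        w = T @ r                             # preconditioned residual
        cols = [x, w] if p is None else [x, w, p]
        V, _ = np.linalg.qr(np.column_stack(cols))
        H = V.T @ A @ V                       # Rayleigh-Ritz on the subspace
        _, U = np.linalg.eigh(H)
        x_new = V @ U[:, 0]                   # locally optimal new iterate
        p = x_new - x * (x @ x_new)           # implicit momentum term
        x = x_new / np.linalg.norm(x_new)
    return x @ (A @ x), x

# Usage: smallest eigenvalue of a 1-D Laplacian, Jacobi preconditioner
n = 200
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
T = np.diag(1.0 / np.diag(A))
rho, x = lopcg(A, T, np.random.default_rng(0).standard_normal(n))
```

To see why the exponent matters: with a convergence factor of $1-C\kappa^{-1}$, reducing the error by a fixed factor takes $O(\kappa)$ iterations, whereas a factor of $1-C\kappa^{-1/2}$ needs only $O(\kappa^{1/2})$; for $\kappa = 10^4$ this is roughly a hundredfold reduction in iteration count.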