We propose inexact subspace iteration for solving high-dimensional eigenvalue problems with low-rank structure. Inexactness stems from low-rank compression, which enables efficient representation of high-dimensional vectors in a low-rank tensor format. A primary challenge in these methods is that standard operations, such as matrix-vector products and linear combinations, increase tensor rank, necessitating rank truncation and hence approximation. We compare the proposed methods against an existing inexact Lanczos method that uses low-rank compression. That method constructs an approximate orthonormal Krylov basis, which is often difficult to represent accurately in low-rank tensor formats, even when the eigenvectors themselves exhibit low-rank structure. In contrast, inexact subspace iteration uses approximate eigenvectors (Ritz vectors) directly as a subspace basis, bypassing the need for an orthonormal Krylov basis. Our analysis and numerical experiments demonstrate that inexact subspace iteration is much more robust to rank-truncation errors than the inexact Lanczos method. We also demonstrate that rank-truncated subspace iteration can converge for problems where the DMRG method stagnates. Furthermore, the proposed subspace iteration methods do not require a Hermitian matrix, in contrast to Lanczos and DMRG, which are designed specifically for Hermitian matrices.
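As a rough illustration of the idea described above, the following sketch implements subspace iteration with a per-vector rank-truncation step, using dense NumPy arrays as a stand-in for low-rank tensor formats. The helper names (truncate, inexact_subspace_iteration), the reshape-and-SVD compression, and the toy problem are illustrative assumptions, not the paper's implementation.

import numpy as np

def truncate(v, shape, rank):
    # Mimic low-rank compression: reshape the vector into a matrix and keep
    # only its leading `rank` singular values, then flatten back to a vector.
    U, s, Vt = np.linalg.svd(v.reshape(shape), full_matrices=False)
    return ((U[:, :rank] * s[:rank]) @ Vt[:rank, :]).ravel()

def inexact_subspace_iteration(A, k, shape, rank, iters=50, seed=0):
    # Approximate the k dominant eigenpairs of A, truncating every basis
    # vector after each block matrix-vector product.
    n = A.shape[0]
    V = np.linalg.qr(np.random.default_rng(seed).standard_normal((n, k)))[0]
    for _ in range(iters):
        W = A @ V                                        # block matrix-vector product (rank grows here)
        W = np.column_stack([truncate(w, shape, rank) for w in W.T])
        Q, _ = np.linalg.qr(W)                           # re-orthonormalize the (inexact) block
        H = Q.conj().T @ A @ Q                           # small projected eigenproblem (Rayleigh-Ritz)
        evals, Y = np.linalg.eig(H)                      # works for non-Hermitian A as well
        order = np.argsort(-np.abs(evals))
        V = Q @ Y[:, order]                              # Ritz vectors become the next subspace basis
    return evals[order], V

# Toy example: a 64x64 matrix whose eigenvectors, reshaped to 8x8, are exactly rank one,
# so a truncation rank of 2 loses essentially nothing.
A = np.diag(np.linspace(1.0, 2.0, 64))
vals, vecs = inexact_subspace_iteration(A, k=3, shape=(8, 8), rank=2)

The key design point the sketch tries to convey is that the truncation is applied to the Ritz vectors (the current eigenvector approximations) rather than to an orthonormal Krylov basis, which is why the iteration can tolerate fairly aggressive compression.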