Multigrid preconditioners are among the most powerful techniques for solving large sparse linear systems. In this research, we address Darcy flow problems with random permeability using the conjugate gradient method, enhanced by a two-grid preconditioner based on a generalized multiscale prolongation operator, which has been shown to be stable for high-contrast profiles. To circumvent the need for repeatedly solving spectral problems with varying coefficients, we harness deep learning techniques to expedite the construction of the generalized multiscale prolongation operator. Since linear transformations of the multiscale basis have no impact on the performance of the preconditioner, we devise a loss function based on a coefficient-based distance between subspaces rather than the $l^2$-norm of the difference between the corresponding multiscale bases. We find that leveraging the inherent symmetry in the local spectral problem effectively accelerates the neural network training process. In scenarios where training data are limited, we utilize the Karhunen-Lo\`eve expansion to augment the dataset. Extensive numerical experiments with various types of random coefficient models are presented, showing that the proposed method significantly reduces the time required to generate the prolongation operator while maintaining the original efficiency of the two-grid preconditioner.
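The key point of the loss design is invariance: any invertible recombination of the multiscale basis columns spans the same coarse space, so the loss should compare subspaces, not individual basis vectors. The abstract does not specify the exact metric; a minimal sketch of one standard invariant choice (the Frobenius distance between orthogonal projectors, used here purely for illustration; the function names are hypothetical) is:

```python
import numpy as np

def projector(basis: np.ndarray) -> np.ndarray:
    """Orthogonal projector onto the column span of `basis`."""
    q, _ = np.linalg.qr(basis)
    return q @ q.T

def subspace_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Frobenius distance between the projectors onto span(u) and span(v).
    Invariant under any invertible linear transformation of either basis,
    unlike the l2-norm of the difference of the bases themselves."""
    return float(np.linalg.norm(projector(u) - projector(v)))

rng = np.random.default_rng(0)
basis = rng.standard_normal((50, 4))          # 4 multiscale basis vectors
mixed = basis @ rng.standard_normal((4, 4))   # invertible recombination
print(subspace_distance(basis, mixed))        # ~0: same subspace
other = rng.standard_normal((50, 4))
print(subspace_distance(basis, other))        # > 0: different subspace
```

A neural network trained against such a loss is free to output any basis of the correct coarse space, which is exactly what the preconditioner requires.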