Matrix splitting iteration methods play a vital role in solving large sparse linear systems. Their performance heavily depends on the splitting parameters; however, the approach for selecting optimal splitting parameters has not been well developed. In this paper, we present a multitask kernel-learning parameter prediction method to automatically obtain relatively optimal splitting parameters, which combines simultaneous prediction of multiple parameters with data-driven kernel learning. For solving time-dependent linear systems, including linear differential systems and linear matrix systems, we propose a new matrix splitting Kronecker product method, together with its convergence analysis and preconditioning strategy. Numerical results illustrate that, compared with existing methods, our methods can save an enormous amount of time in selecting the relatively optimal splitting parameters. Moreover, our iteration method, used as a preconditioner, can effectively accelerate GMRES. As the dimension of the systems increases, all these advantages become more significant. In particular, for solving the differential Sylvester matrix equation, the speedup ratio can reach tens to hundreds when the scale of the system exceeds one hundred thousand.