Domain decomposition methods are widely used and effective in the approximation of solutions to partial differential equations. Yet the optimal construction of these methods requires tedious analysis and is often available only in simplified, structured-grid settings, limiting their use for more complex problems. In this work, we generalize optimized Schwarz domain decomposition methods to unstructured-grid problems, using Graph Convolutional Neural Networks (GCNNs) and unsupervised learning to learn optimal modifications at subdomain interfaces. A key ingredient in our approach is an improved loss function, enabling effective training on relatively small problems, but robust performance on arbitrarily large problems, with computational cost linear in problem size. The performance of the learned linear solvers is compared with both classical and optimized domain decomposition algorithms, for both structured- and unstructured-grid problems.
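To make the classical setting concrete, the following is a minimal NumPy sketch (not the paper's GCNN method) of a two-subdomain optimized Schwarz iteration for the 1D Poisson problem with a Robin transmission parameter p. The parameter p plays the role of the interface modification that the paper proposes to learn with a GCNN on unstructured grids; the grid size, interface location, and parameter values below are purely illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: nonoverlapping optimized Schwarz with Robin transmission
# conditions for -u'' = f on (0,1), u(0) = u(1) = 0, run on the error equation
# (f = 0, random initial guess) so the remaining error measures convergence speed.

def optimized_schwarz_error(p, N=100, m=40, iters=10):
    """Return the error remaining after `iters` Robin-Schwarz sweeps with parameter p."""
    h = 1.0 / N
    # Subdomain 1: unknowns at nodes 1..m (node m is the interface).
    n1 = m
    A1 = np.zeros((n1, n1))
    for i in range(n1 - 1):                  # interior rows: standard 3-point stencil
        A1[i, i] = 2.0 / h**2
        if i > 0:
            A1[i, i - 1] = -1.0 / h**2
        A1[i, i + 1] = -1.0 / h**2
    A1[-1, -2] = -1.0 / h                    # Robin row: (u_m - u_{m-1})/h + p*u_m = g1
    A1[-1, -1] = 1.0 / h + p

    # Subdomain 2: unknowns at nodes m..N-1 (node m is the interface).
    n2 = N - m
    A2 = np.zeros((n2, n2))
    A2[0, 0] = 1.0 / h + p                   # Robin row: -(u_{m+1} - u_m)/h + p*u_m = g2
    A2[0, 1] = -1.0 / h
    for i in range(1, n2):                   # interior rows: standard 3-point stencil
        A2[i, i] = 2.0 / h**2
        A2[i, i - 1] = -1.0 / h**2
        if i < n2 - 1:
            A2[i, i + 1] = -1.0 / h**2

    rng = np.random.default_rng(0)
    u1 = rng.standard_normal(n1)             # error iterates (exact solution is zero)
    u2 = rng.standard_normal(n2)
    for _ in range(iters):
        # Exchange Robin data across the interface (one-sided differences).
        g1 = (u2[1] - u2[0]) / h + p * u2[0]
        g2 = -(u1[-1] - u1[-2]) / h + p * u1[-1]
        b1 = np.zeros(n1); b1[-1] = g1
        b2 = np.zeros(n2); b2[0] = g2
        u1 = np.linalg.solve(A1, b1)
        u2 = np.linalg.solve(A2, b2)
    return max(np.linalg.norm(u1, np.inf), np.linalg.norm(u2, np.inf))

# Convergence depends strongly on the interface parameter p; classical Schwarz
# has no such tunable parameter, and choosing p well is the "optimized" part.
for p in [0.5, 1.0, 2.5, 5.0, 20.0]:
    print(f"p = {p:5.1f}  ->  remaining error {optimized_schwarz_error(p):.2e}")
```

On this structured 1D model problem the best p can be derived analytically; the abstract's point is that such analysis is unavailable for general unstructured grids, which is where the learned, graph-based interface modifications come in.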