Learning the solution operators of partial differential equations (PDEs) is an essential problem in machine learning. However, learning operators in practical applications raises several challenges, such as irregular meshes, multiple input functions, and the complexity of PDE solutions. To address these challenges, we propose the General Neural Operator Transformer (GNOT), a scalable and effective transformer-based framework for learning operators. By designing a novel heterogeneous normalized attention layer, our model is highly flexible and can handle multiple input functions and irregular meshes. In addition, we introduce a geometric gating mechanism, which can be viewed as a soft domain decomposition, to tackle multi-scale problems. The large model capacity of the transformer architecture allows our model to scale to large datasets and practical problems. We conduct extensive experiments on multiple challenging datasets from different domains and achieve remarkable improvements over alternative methods.
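To make the two mechanisms concrete, below is a minimal PyTorch sketch of (i) a normalized linear-attention layer, where queries and keys are separately softmax-normalized so the full attention matrix is never materialized, and (ii) a coordinate-conditioned mixture-of-experts gate acting as a soft domain decomposition. The class names, layer sizes, and expert count are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class NormalizedLinearAttention(nn.Module):
    """Linear-time attention: softmax is applied to Q over the feature
    dimension and to K over the sequence dimension, avoiding the
    O(N^2) score matrix. (Sketch, not the paper's exact layer.)"""
    def __init__(self, dim, heads=8):
        super().__init__()
        self.heads = heads
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x, context):
        # x: query points (B, Nq, D); context: one input function (B, Nk, D)
        B, Nq, D = x.shape
        h, d = self.heads, D // self.heads
        q = self.to_q(x).view(B, Nq, h, d).transpose(1, 2)
        k = self.to_k(context).view(B, -1, h, d).transpose(1, 2)
        v = self.to_v(context).view(B, -1, h, d).transpose(1, 2)
        q = q.softmax(dim=-1)                 # normalize over features
        k = k.softmax(dim=-2)                 # normalize over sequence
        out = q @ (k.transpose(-2, -1) @ v)   # O(N d^2) instead of O(N^2 d)
        out = out.transpose(1, 2).reshape(B, Nq, D)
        return self.proj(out)

class GeometricGatedFFN(nn.Module):
    """Soft domain decomposition: mixture weights over expert FFNs are
    predicted from the query point coordinates, so different spatial
    regions are handled by different experts."""
    def __init__(self, dim, n_experts=4, coord_dim=2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim * 2), nn.GELU(),
                          nn.Linear(dim * 2, dim))
            for _ in range(n_experts))
        self.gate = nn.Sequential(nn.Linear(coord_dim, dim), nn.GELU(),
                                  nn.Linear(dim, n_experts))

    def forward(self, x, coords):
        # x: features (B, N, D); coords: point coordinates (B, N, coord_dim)
        w = self.gate(coords).softmax(dim=-1)               # (B, N, E)
        expert_out = torch.stack([e(x) for e in self.experts], dim=-1)
        return (expert_out * w.unsqueeze(-2)).sum(dim=-1)   # (B, N, D)
```

In this sketch, handling multiple input functions would amount to applying the cross-attention layer once per input function (or concatenating their encodings as the context), and the gate's softmax weights interpolate smoothly between experts rather than partitioning the domain with hard boundaries.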