Based on the variational method, we propose a novel paradigm that provides a unified framework for training neural operators and solving partial differential equations (PDEs) in their variational form, which we refer to as variational operator learning (VOL). We first derive the functional approximation of the system from the nodal solution predicted by the neural operator, and then perform the variational operation via automatic differentiation, constructing a forward-backward propagation loop to obtain the residual of the linear system. In every iteration, one or several update steps of the steepest descent method (SD) or the conjugate gradient method (CG) serve as a cheap yet effective update for training the neural operator. Experimental results show that the proposed VOL learns a variety of solution operators for PDEs of steady heat transfer and variable-stiffness elasticity with satisfactory results and small error. VOL achieves nearly label-free training: only five to ten labels are used for the output distribution-shift session in all experiments. Generalization benefits of VOL are investigated and discussed.
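The SD and CG update steps mentioned above can be sketched for a generic symmetric positive-definite linear system K u = f. This is an illustrative sketch only, not the paper's implementation: the function names, the use of NumPy, and the dense matrix K are assumptions for demonstration; in VOL the residual comes from the variational form via automatic differentiation rather than an explicit matrix.

```python
import numpy as np

def sd_step(K, u, f):
    """One steepest-descent step for K u = f (K symmetric positive definite).

    Illustrative sketch: moves u along the residual direction with an
    exact line search, which is the cheap single-step update SD provides.
    """
    r = f - K @ u                      # residual of the linear system
    alpha = (r @ r) / (r @ (K @ r))    # exact line-search step length
    return u + alpha * r

def cg_steps(K, u, f, n_steps):
    """A few conjugate-gradient steps for K u = f (K symmetric positive definite).

    Illustrative sketch: each step combines the new residual with the
    previous search direction, so successive directions are K-conjugate.
    """
    r = f - K @ u
    p = r.copy()                       # initial search direction
    for _ in range(n_steps):
        Kp = K @ p
        alpha = (r @ r) / (p @ Kp)     # step length along p
        u = u + alpha * p
        r_new = r - alpha * Kp         # updated residual
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p           # next K-conjugate direction
        r = r_new
    return u
```

In this sketch, either update reduces the residual norm from any starting point; in exact arithmetic CG solves an n-dimensional SPD system in at most n steps, which is why a handful of steps per training iteration can already be effective.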