Neural operators, which use deep neural networks to approximate the solution mappings of partial differential equation (PDE) systems, are emerging as a new paradigm for PDE simulation. Neural operators can be trained in a supervised or unsupervised way, i.e., using generated data or the PDE information itself. The unsupervised training approach is essential when data generation is costly or the available data are of low quality (e.g., scarce or noisy). However, its performance and efficiency leave substantial room for improvement. To this end, we design a new loss function based on the Feynman-Kac formula and call the resulting neural operator the Monte-Carlo Neural Operator (MCNO), which allows larger temporal steps and efficiently handles fractional diffusion operators. Our analyses show that MCNO has advantages in handling complex spatial conditions and larger temporal steps compared with other unsupervised methods. Furthermore, MCNO is more robust to perturbations introduced by the numerical scheme and the operator approximation. Numerical experiments on the diffusion equation and the Navier-Stokes equation show significant accuracy improvements over other unsupervised baselines, especially in settings with highly oscillatory initial conditions and long-time simulation.
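To make the Feynman-Kac connection concrete, the sketch below evaluates the stochastic representation of the 1-D heat equation by Monte-Carlo sampling. This is an illustrative toy, not the MCNO training loss itself; the function name `feynman_kac_heat` and all parameter choices are our own assumptions for the example.

```python
import numpy as np

# Feynman-Kac representation for the 1-D heat equation
#   u_t = nu * u_xx,  u(x, 0) = u0(x),
# whose solution is u(x, t) = E[u0(x + sqrt(2*nu*t) * Z)], Z ~ N(0, 1).
# A Monte-Carlo estimate of this expectation avoids any spatial grid
# and permits a single large temporal step, which is the property the
# MCNO loss exploits.

def feynman_kac_heat(u0, x, t, nu, n_samples=200_000, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)       # standard normal increments
    endpoints = x + np.sqrt(2.0 * nu * t) * z  # diffused sample endpoints
    return u0(endpoints).mean()              # Monte-Carlo expectation

# Example: u0(x) = sin(x) has the exact solution exp(-nu*t) * sin(x),
# so the estimate can be checked in closed form.
nu, t, x = 0.1, 1.0, 0.7
mc = feynman_kac_heat(np.sin, x, t, nu)
exact = np.exp(-nu * t) * np.sin(x)
```

In an operator-learning setting, the expectation above would be compared against the network's predicted solution at time `t` to form an unsupervised loss, with the Monte-Carlo samples playing the role of the training signal.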