Modeling dynamics in the form of partial differential equations (PDEs) is an effective way to understand real-world physical processes. For complex physical systems, analytical solutions are not available and numerical solutions are widely used. However, traditional numerical algorithms are computationally expensive and struggle to handle multiphysics systems. Recently, solving PDEs with neural networks has made significant progress under the name of physics-informed neural networks (PINNs). PINNs encode physical laws into neural networks and learn continuous solutions of PDEs. During training, existing methods suffer from inefficiency and unstable convergence, since evaluating the PDE residuals requires automatic differentiation. In this paper, we propose Dynamic Mesh-based Importance Sampling (DMIS) to tackle these problems. DMIS is a novel sampling scheme based on importance sampling, which constructs a dynamic triangular mesh to estimate sample weights efficiently. DMIS has broad applicability and can be easily integrated into existing methods. Evaluation of DMIS on three widely used benchmarks shows that it improves convergence speed and accuracy at the same time. In particular, when solving the highly nonlinear Schr\"odinger Equation, DMIS achieves up to 46% lower root mean square error and five times faster convergence than state-of-the-art methods. Code is available at https://github.com/MatrixBrain/DMIS.
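The core idea DMIS builds on, drawing a training batch of collocation points with probability proportional to an estimated weight so that high-residual regions are sampled more often, can be illustrated with a minimal NumPy sketch. This is a generic residual-based importance-sampling toy, not the paper's method: the `residual` profile is a made-up stand-in for the network's PDE residual, and the dynamic triangular mesh DMIS uses to estimate weights efficiently is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def residual(x):
    # Hypothetical stand-in for |PDE residual| of the current network:
    # a sharp feature near x = 0.5 plus a small uniform floor.
    return np.exp(-200.0 * (x - 0.5) ** 2) + 0.05

# Candidate pool of collocation points in the domain [0, 1]
pool = rng.uniform(0.0, 1.0, size=10_000)

# Importance weights proportional to residual magnitude,
# normalized into a sampling distribution over the pool
w = residual(pool)
p = w / w.sum()

# Resample a training batch: points in high-residual regions are favored
batch = rng.choice(pool, size=1_000, replace=True, p=p)

# Most of the batch should now cluster around the sharp feature,
# whereas uniform sampling would place only ~20% of points there
frac_near_peak = np.mean(np.abs(batch - 0.5) < 0.1)
print(f"fraction of batch near the sharp feature: {frac_near_peak:.2f}")
```

In a full PINN training loop the batch would be fed through the network, residuals recomputed by automatic differentiation, and (to keep the loss estimate unbiased) each sampled point's contribution reweighted by the inverse of its sampling probability.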