Physics-Informed Neural Networks (PINNs) have enabled significant improvements in modelling physical processes described by partial differential equations (PDEs). PINNs are based on simple architectures, and learn the behavior of complex physical systems by optimizing the network parameters to minimize the residual of the underlying PDE. Current network architectures share some of the limitations of classical numerical discretization schemes when applied to non-linear differential equations in continuum mechanics. A paradigmatic example is the solution of hyperbolic conservation laws that develop highly localized nonlinear shock waves. Learning solutions of PDEs with dominant hyperbolic character is a challenge for current PINN approaches, which rely, like most grid-based numerical schemes, on adding artificial dissipation. Here, we address the fundamental question of which network architectures are best suited to learn the complex behavior of non-linear PDEs. We focus on network architecture rather than on residual regularization. Our new methodology, called Physics-Informed Attention-based Neural Networks (PIANNs), is a combination of recurrent neural networks and attention mechanisms. The attention mechanism adapts the behavior of the deep neural network to the non-linear features of the solution, and breaks the current limitations of PINNs. We find that PIANNs effectively capture the shock front in a hyperbolic model problem, and are capable of providing high-quality solutions inside and beyond the training set.
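The architecture described above, a recurrent encoder whose hidden states are combined through attention weights, can be sketched minimally as follows. This is a hedged illustration only: the recurrence (a plain tanh cell standing in for a GRU), the learned query vector `q`, and all dimensions are assumptions for exposition, not the authors' actual PIANN implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
T, d = 8, 4                       # sequence length, hidden size (assumed)
W = rng.normal(size=(d, d))       # recurrent weights (random stand-ins)
U = rng.normal(size=(d, 1))       # input weights
x = rng.normal(size=(T, 1))       # inputs, e.g. spatio-temporal collocation points

# Simple tanh recurrence (a stand-in for the recurrent network in a PIANN)
h = np.zeros((T, d))
prev = np.zeros(d)
for t in range(T):
    prev = np.tanh(W @ prev + (U @ x[t]).ravel())
    h[t] = prev

# Attention readout: score each hidden state against a query,
# normalize with softmax, and form a weighted context vector.
q = rng.normal(size=d)            # hypothetical learned query vector
alpha = softmax(h @ q)            # attention weights, sum to 1
context = alpha @ h               # adaptive combination of hidden states
```

In a PINN-style training loop, `context` would feed a final layer predicting the PDE solution, and the parameters would be fit by minimizing the PDE residual at the collocation points; the attention weights `alpha` let the readout concentrate on the hidden states most relevant near sharp features such as shocks.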