Despite the great promise of physics-informed neural networks (PINNs) for solving forward and inverse problems, several technical challenges remain as roadblocks to more complex and realistic applications. First, most existing PINNs are based on point-wise formulations with fully-connected networks that learn continuous functions, which suffer from poor scalability and difficulty in enforcing boundary conditions exactly. Second, the infinite search space over-complicates the non-convex optimization of network training. Third, although convolutional neural network (CNN)-based discrete learning can significantly improve training efficiency, CNNs struggle to handle irregular geometries with unstructured meshes. To address these challenges, we present a novel discrete PINN framework, based on graph convolutional networks (GCNs) and the variational structure of the governing partial differential equations (PDEs), that solves forward and inverse PDEs in a unified manner. The use of a piecewise polynomial basis reduces the dimension of the search space and facilitates training and convergence. Without the need to tune penalty parameters as in classic PINNs, the proposed method can strictly impose boundary conditions and assimilate sparse data in both forward and inverse settings. The flexibility of GCNs is leveraged to handle irregular geometries with unstructured meshes. The effectiveness and merit of the proposed method are demonstrated on a variety of forward and inverse computational mechanics problems governed by both linear and nonlinear PDEs.
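To make the variational idea concrete, the following is a minimal, hedged sketch (not the paper's GCN implementation): it minimizes the Ritz energy of the 1D Poisson problem -u'' = f on [0, 1] with u(0) = u(1) = 0, using a piecewise linear basis on a mesh. Only interior nodal values are trainable, so the boundary conditions are satisfied exactly by construction, and the finite-dimensional basis reduces the search space from a continuous function space to a small vector of coefficients. The mesh size, source term, and optimizer settings here are illustrative choices, not values from the paper.

```python
import numpy as np

n = 64                           # number of elements (illustrative choice)
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = np.ones(n - 1)               # source term f(x) = 1 at interior nodes

# Stiffness matrix of the piecewise-linear basis, interior nodes only;
# boundary rows/columns are excluded, so u(0) = u(1) = 0 is imposed exactly.
K = (np.diag(2.0 * np.ones(n - 1))
     - np.diag(np.ones(n - 2), 1)
     - np.diag(np.ones(n - 2), -1)) / h
b = h * f                        # lumped load vector

# Gradient descent on the discrete energy E(u) = 0.5 u^T K u - b^T u;
# the gradient is K u - b, and the minimizer solves K u = b.
u = np.zeros(n - 1)              # interior degrees of freedom
lr = 0.4 * h                     # step size below the stability limit 2/lambda_max
for _ in range(20000):
    u -= lr * (K @ u - b)

u_full = np.concatenate(([0.0], u, [0.0]))   # exact boundary values
u_exact = 0.5 * x * (1.0 - x)                # analytic solution for f = 1
```

The same structure carries over to the discrete PINN setting: replace the explicit nodal vector with the output of a network over the mesh graph, keep the energy functional as the loss, and boundary nodes stay pinned rather than penalized.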