Regular physics-informed neural networks (PINNs) predict the solution of partial differential equations from sparse labeled data, but only over a single domain. Fully supervised learning models, by contrast, are typically first trained over a few thousand domains with known solutions (i.e., labeled data) and then predict the solution over a few hundred unseen domains. Physics-informed PointNet (PIPN) is designed primarily to fill this gap between PINNs (as weakly supervised learning models) and fully supervised learning models. In this article, we demonstrate that PIPN predicts the solution of desired partial differential equations over a few hundred domains simultaneously, while using only sparse labeled data. This framework benefits fast geometric design in industry when only sparse labeled data are available. In particular, we show that PIPN simultaneously predicts the solution of a plane stress problem over more than 500 domains with different geometries. Moreover, we pioneer implementing the concept of remarkable batch size (i.e., the number of geometries fed into PIPN at each sub-epoch) into PIPN; specifically, we try batch sizes of 7, 14, 19, 38, 76, and 133. Additionally, we investigate the effects of the PIPN size, the symmetric function in the PIPN architecture, and static and dynamic weights for the sparse-labeled-data component of the loss function.
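To make the last point concrete, a PINN-style composite loss typically sums a PDE-residual term and a sparse-labeled-data term, with the data term scaled by a weight that is either fixed (static) or updated during training (dynamic). The following is a minimal sketch of that structure only; the function names and the particular magnitude-balancing heuristic for the dynamic weight are illustrative assumptions, not PIPN's actual scheme.

```python
def mse(values):
    """Mean squared value of a sequence (used for both loss components)."""
    return sum(v * v for v in values) / len(values)

def composite_loss(pde_residuals, data_errors, lam):
    """Composite loss: mean-squared PDE residual plus a weighted
    sparse-labeled-data misfit. With a static scheme, lam is a constant."""
    return mse(pde_residuals) + lam * mse(data_errors)

def dynamic_weight(pde_residuals, data_errors, eps=1e-12):
    """One common heuristic for a dynamic weight: balance the two terms
    by their current magnitudes. This rule is an assumption made here for
    illustration, not the weighting scheme used in the article."""
    return mse(pde_residuals) / (mse(data_errors) + eps)
```

In a training loop, `lam` would either stay fixed (static weighting) or be recomputed from `dynamic_weight` every few epochs (dynamic weighting), changing how strongly the sparse labeled data pull the solution relative to the physics residual.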