Physics-informed Neural Networks (PINNs) are gaining attention in the engineering and scientific literature for solving a range of differential equations, with applications in weather modeling, healthcare, and manufacturing, among others. Poor scalability is one of the barriers to applying PINNs to many real-world problems. To address this, a Self-scalable tanh (Stan) activation function is proposed for PINNs. The proposed Stan function is smooth, non-saturating, and has a trainable parameter. During training, it allows gradients to flow easily for computing the required derivatives and also enables systematic scaling of the input-output mapping. It is shown theoretically that PINNs with the proposed Stan function have no spurious stationary points when trained with gradient-descent algorithms. The proposed Stan is tested on a number of numerical studies involving general regression problems. It is subsequently used to solve multiple forward problems, which involve second-order derivatives and multiple dimensions, and an inverse problem in which the thermal diffusivity of a rod is predicted from heat-conduction data. These case studies establish empirically that the Stan activation function can achieve better training and more accurate predictions than existing activation functions in the literature.
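To make the description concrete, the sketch below implements one plausible form of a self-scalable tanh activation consistent with the abstract's stated properties (smooth, non-saturating, with a trainable parameter). The specific form `tanh(x) + beta * x * tanh(x)` and the parameter name `beta` are assumptions for illustration; the paper's exact definition may differ.

```python
import numpy as np

def stan(x, beta=1.0):
    # Hypothetical self-scalable tanh (assumed form; not taken from the
    # abstract): tanh(x) + beta * x * tanh(x). The x*tanh(x) term keeps
    # the output non-saturating for beta > 0, and beta is the trainable
    # per-neuron scale parameter in this sketch.
    return np.tanh(x) + beta * x * np.tanh(x)

def stan_grad(x, beta=1.0):
    # Analytic derivative w.r.t. x, which a PINN needs when it
    # differentiates the network output to form PDE residuals.
    t = np.tanh(x)
    return (1.0 - t**2) + beta * (t + x * (1.0 - t**2))
```

For small inputs the function behaves like a standard tanh, while for large inputs the `beta * x * tanh(x)` term grows roughly linearly, so the gradient does not vanish the way a plain tanh's does.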