Although physics-informed neural networks (PINNs) have recently made substantial progress in many real-world applications, several problems remain to be studied further, such as achieving more accurate results, reducing training time, and quantifying the uncertainty of the predicted results. Recent advances have indeed improved the performance of PINNs in many respects, but few works have considered the effect of variance in the training process. In this work, we take the effect of variance into account and propose VI-PINNs to give better predictions. The final layer of the network outputs two values that represent the predicted mean and variance, respectively; the latter is used to quantify the uncertainty of the output. A modified negative log-likelihood loss and an auxiliary task are introduced for fast and accurate training. We perform experiments on a wide range of problems to highlight the advantages of our approach. The results show that our method not only gives more accurate predictions but also converges faster.
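As a rough illustration of the mean-and-variance output described above, the sketch below shows a small network whose final layer produces a predicted mean and a log-variance, trained with a Gaussian negative log-likelihood term. This is a minimal sketch under our own assumptions, not the authors' implementation; the architecture sizes, the `gaussian_nll` helper, and the `auxiliary_loss` placeholder are illustrative.

```python
import torch
import torch.nn as nn

class MeanVarianceNet(nn.Module):
    """Fully connected network with a two-valued head: mean and log-variance."""

    def __init__(self, in_dim=1, hidden=50):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
        )
        self.head = nn.Linear(hidden, 2)  # two outputs: predicted mean and log-variance

    def forward(self, x):
        out = self.head(self.body(x))
        mean, log_var = out[..., :1], out[..., 1:]
        return mean, log_var


def gaussian_nll(residual, log_var):
    """Gaussian negative log-likelihood of residuals with predicted log-variance.

    `residual` may be a data misfit or a PDE residual evaluated at the same
    points as `log_var`; the log-variance parameterization keeps the variance
    positive without an explicit constraint.
    """
    return 0.5 * (torch.exp(-log_var) * residual ** 2 + log_var).mean()


# Illustrative usage on collocation points x of shape [N, 1] (names assumed):
# mean, log_var = model(x)
# loss = gaussian_nll(u_data - mean, log_var) + lambda_aux * auxiliary_loss(mean, log_var)
```

In this kind of setup, the predicted variance serves directly as the uncertainty estimate for the output, while the auxiliary term (left abstract here) would correspond to the additional task the abstract mentions for stabilizing and speeding up training.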