We propose energy natural gradient descent, a natural gradient method with respect to a Hessian-induced Riemannian metric, as an optimization algorithm for physics-informed neural networks (PINNs) and the deep Ritz method. As a main motivation we show that the update direction in function space resulting from the energy natural gradient corresponds to the Newton direction modulo an orthogonal projection onto the model's tangent space. We demonstrate experimentally that energy natural gradient descent yields highly accurate solutions with errors several orders of magnitude smaller than what is obtained when training PINNs with standard optimizers like gradient descent or Adam, even when those are allowed significantly more computation time.
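To make the update concrete, below is a minimal sketch (not the authors' implementation) of one energy natural gradient step for a PINN on the 1D Poisson problem $-u'' = f$ with zero boundary values. It assumes that, for this quadratic residual loss with a linear differential operator, the Hessian-induced energy metric reduces to the Gram matrix $J^\top J$ of the residual Jacobian, and it applies a pseudoinverse of that matrix via a least-squares solve; all function and parameter names are illustrative.

```python
# A minimal sketch (not the authors' implementation) of one energy natural
# gradient step for a PINN on the 1D Poisson problem -u'' = f, u(0) = u(1) = 0.
# Assumption: for this quadratic residual loss with a linear operator, the
# Hessian-induced energy metric reduces to the Gram matrix J^T J of the
# residual Jacobian, and the update applies its pseudoinverse (here via lstsq).
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def mlp(params, x):
    # Small tanh network mapping a scalar x to a scalar u(x).
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = params[-1]
    return (W @ h + b)[0]

def residuals(params, xs_int, xs_bdy, f):
    # PDE residual -u''(x) - f(x) at interior points, plus boundary residuals u(x).
    u = lambda x: mlp(params, x)
    u_xx = jax.vmap(jax.grad(jax.grad(u)))(xs_int)
    return jnp.concatenate([-u_xx - jax.vmap(f)(xs_int), jax.vmap(u)(xs_bdy)])

def energy_natural_gradient_step(params, xs_int, xs_bdy, f, lr=1.0):
    flat, unravel = ravel_pytree(params)
    r = lambda p: residuals(unravel(p), xs_int, xs_bdy, f)
    res = r(flat)
    J = jax.jacfwd(r)(flat)        # residual Jacobian, shape (n_residuals, n_params)
    grad = J.T @ res               # Euclidean gradient of the loss 0.5 * ||res||^2
    G = J.T @ J                    # Gram matrix of the energy (Hessian-induced) metric
    # Least-squares solve acts as a pseudoinverse of the possibly singular Gram matrix.
    direction, *_ = jnp.linalg.lstsq(G, grad, rcond=None)
    return unravel(flat - lr * direction)
```

A full training loop would repeat this step on fixed or resampled collocation points, possibly combined with a line search on the step size; the pseudoinverse is what projects the Newton direction in function space onto the model's tangent space.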