The problem of solving partial differential equations (PDEs) can be formulated as a least-squares minimization problem in which neural networks parametrize candidate PDE solutions; a global minimizer then corresponds to a neural network that solves the given PDE. In this paper, we show that, under an over-parametrization assumption, gradient descent can identify a global minimizer of the least-squares optimization for solving second-order linear PDEs with two-layer neural networks. We also analyze, without the over-parametrization assumption, the generalization error of the least-squares optimization for second-order linear PDEs and two-layer neural networks, when the right-hand-side function of the PDE lies in a Barron-type space and the least-squares objective is regularized with a Barron-type norm.
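The least-squares formulation described above can be illustrated with a minimal sketch (a hypothetical example, not taken from the paper): a two-layer tanh network parametrizes the solution of the 1D Poisson problem \(-u''(x) = f(x)\) on \([0, 1]\) with zero boundary conditions, and plain gradient descent minimizes the squared PDE residual plus a boundary penalty. All names, widths, and step sizes here are illustrative choices.

```python
import numpy as np

# Hypothetical 1D example: -u''(x) = f(x) on [0, 1], u(0) = u(1) = 0,
# with f(x) = pi^2 sin(pi x), so the true solution is u(x) = sin(pi x).
rng = np.random.default_rng(0)
m = 20                                      # hidden width (illustrative)
theta = rng.normal(scale=0.5, size=3 * m)   # packed parameters [a, w, b]

xs = np.linspace(0.0, 1.0, 64)              # collocation points
f = np.pi**2 * np.sin(np.pi * xs)           # right-hand side

def unpack(th):
    return th[:m], th[m:2 * m], th[2 * m:]

def u(th, x):
    # two-layer network: u(x) = sum_k a_k * tanh(w_k x + b_k)
    a, w, b = unpack(th)
    return np.tanh(np.outer(x, w) + b) @ a

def u_xx(th, x):
    # second derivative, using tanh''(z) = -2 tanh(z)(1 - tanh(z)^2)
    a, w, b = unpack(th)
    t = np.tanh(np.outer(x, w) + b)
    return (-2.0 * t * (1.0 - t**2)) @ (a * w**2)

def loss(th):
    residual = -u_xx(th, xs) - f            # least-squares PDE residual
    boundary = u(th, np.array([0.0, 1.0]))  # boundary-condition penalty
    return np.mean(residual**2) + 10.0 * np.mean(boundary**2)

def num_grad(th, eps=1e-6):
    # central finite-difference gradient (keeps the sketch dependency-free;
    # an autodiff framework would be used in practice)
    g = np.zeros_like(th)
    for i in range(th.size):
        e = np.zeros_like(th)
        e[i] = eps
        g[i] = (loss(th + e) - loss(th - e)) / (2 * eps)
    return g

l0 = loss(theta)
lr = 1e-3
for step in range(500):                     # plain gradient descent
    theta -= lr * num_grad(theta)

print(f"loss: {l0:.3f} -> {loss(theta):.3f}")
```

The gradient descent loop decreases the least-squares objective; the paper's global-convergence result concerns the over-parametrized regime (width much larger than the tiny network used here).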