This paper studies the least-squares ReLU neural network method for solving the linear advection-reaction problem with a discontinuous solution. The method discretizes an equivalent least-squares formulation over the set of neural network functions with the ReLU activation function. It can approximate the discontinuous interface of the underlying problem automatically through the free hyperplanes of the ReLU neural network and hence outperforms mesh-based numerical methods in terms of the number of degrees of freedom. Numerical results for several benchmark test problems show that the method not only approximates the solution with a minimal number of parameters, but also avoids the Gibbs phenomenon common along a discontinuous interface. Moreover, a three-layer ReLU neural network is both necessary and sufficient to accurately approximate a discontinuous solution whose interface in $\mathbb{R}^2$ is not a straight line.
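To illustrate the idea in a toy setting (this is a hedged sketch, not the paper's solver): a shallow ReLU network is continuous and piecewise linear, so a least-squares fit of a jump function resolves the discontinuity with a steep, monotone ramp placed by the network's breakpoints (the "free hyperplanes"), rather than with Gibbs-type oscillations. Here the two breakpoints bracketing the jump are fixed by hand so that the outer weights can be found by an ordinary linear least-squares solve; in the actual method these breakpoints are trainable parameters.

```python
import numpy as np

# Target: a step function on [0, 1] with a jump at x = 0.5.
x = np.linspace(0.0, 1.0, 401)
u = (x >= 0.5).astype(float)

# Two ReLU neurons whose breakpoints bracket the jump (hand-picked
# here; trainable in the actual method), plus a constant term.
eps = 0.01
features = np.column_stack([
    np.ones_like(x),
    np.maximum(x - (0.5 - eps), 0.0),
    np.maximum(x - (0.5 + eps), 0.0),
])

# Linear least-squares solve for the outer weights.
coef, *_ = np.linalg.lstsq(features, u, rcond=None)
approx = features @ coef

# Away from the narrow ramp region the fit is very accurate,
# with no oscillatory overshoot near the jump.
err = np.max(np.abs(approx - u)[np.abs(x - 0.5) > 2 * eps])
print(f"max error away from the jump: {err:.2e}")
```

With only three free coefficients the network reproduces the piecewise-constant states on either side of the jump, which is the mechanism behind the parameter counts quoted above: a mesh-based method would instead need local refinement near the interface.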