Neural Networks have been widely used to solve Partial Differential Equations. These methods require approximating definite integrals using quadrature rules. Here, we illustrate via 1D numerical examples the quadrature problems that may arise in these applications and propose different alternatives to overcome them, namely: Monte Carlo methods, adaptive integration, polynomial approximations of the Neural Network output, and the inclusion of regularization terms in the loss. We also discuss the advantages and limitations of each proposed alternative. We advocate the use of Monte Carlo methods for high dimensions (above 3 or 4), and adaptive integration or polynomial approximations for low dimensions (3 or below). The use of regularization terms is a mathematically elegant alternative that is valid for any spatial dimension; however, it requires certain regularity assumptions on the solution and complex mathematical analysis when dealing with sophisticated Neural Networks.
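As a minimal 1D sketch of the quadrature failure discussed above, consider a stand-in analytic residual r(x) = sin(16πx) in place of the Neural Network residual (this function and the node counts are illustrative assumptions, not the paper's setup): r vanishes at every one of 8 fixed midpoint nodes, so a fixed quadrature rule reports a zero loss even though the true integral of r² over [0, 1] is 1/2, whereas a Monte Carlo estimate with resampled nodes remains unbiased.

```python
import numpy as np

# Hypothetical stand-in for the PDE residual of a trained network on [0, 1].
# sin(16*pi*x) vanishes exactly at the 8 midpoint nodes used below, mimicking
# a network that "hides" oscillations between fixed quadrature points.
def residual(x):
    return np.sin(16.0 * np.pi * x)

def loss_fixed_quadrature(n):
    # Midpoint rule with n fixed nodes: can grossly misestimate the integral
    # of r(x)^2 when r oscillates between the nodes.
    x = (np.arange(n) + 0.5) / n
    return np.mean(residual(x) ** 2)

def loss_monte_carlo(n, rng):
    # Monte Carlo estimate: unbiased in any dimension, error ~ O(n^{-1/2}),
    # and resampling nodes at each iteration avoids overfitting to fixed points.
    x = rng.uniform(0.0, 1.0, size=n)
    return np.mean(residual(x) ** 2)

rng = np.random.default_rng(0)
print("fixed quadrature (8 nodes):", loss_fixed_quadrature(8))   # reports ~0
print("Monte Carlo      (8 nodes):", loss_monte_carlo(8, rng))   # noisy, ~0.5
print("reference (10^6 MC nodes): ", loss_monte_carlo(10**6, rng))  # ~0.5
```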