Using neural networks to solve variational problems, and other scientific machine learning tasks, has been limited by a lack of consistency and an inability to exactly integrate expressions involving neural network architectures. We address these limitations by formulating a novel neural network architecture that combines a polynomial mixture-of-experts model with free-knot B1-spline basis functions. Effectively, our architecture performs piecewise polynomial approximation on each cell of a trainable partition of unity. Our architecture exhibits both $h$- and $p$-refinement for regression problems at the convergence rates expected from approximation theory, allowing for consistency in solving variational problems. Moreover, this architecture, its moments, and its partial derivatives can all be integrated exactly, obviating a reliance on sampling or quadrature and enabling error-free computation of variational forms. We demonstrate the success of our network on a range of regression and variational problems that illustrate the consistency and exact integrability of our network architecture.
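To make the architectural idea concrete, the following is a minimal sketch of a one-dimensional forward pass under stated assumptions: trainable knot locations define B1-spline (hat) basis functions forming a partition of unity, and each basis function weights a polynomial expert on its support. The function names `hat_basis` and `polynomial_moe`, the use of NumPy, and the quadratic experts are illustrative choices, not details taken from the paper.

```python
import numpy as np

def hat_basis(x, knots):
    """Evaluate B1-spline (hat) basis functions at points x.

    knots: sorted 1-D array of knot locations (trainable in the full model).
    Returns an array of shape (len(x), len(knots)) whose rows sum to 1
    on [knots[0], knots[-1]], i.e. a partition of unity.
    """
    x = np.atleast_1d(x)
    phi = np.zeros((x.size, knots.size))
    for i in range(knots.size):
        left = knots[i - 1] if i > 0 else knots[0]
        right = knots[i + 1] if i < knots.size - 1 else knots[-1]
        if knots[i] > left:
            # rising ramp on [left, knots[i]]
            m = (x >= left) & (x <= knots[i])
            phi[m, i] = (x[m] - left) / (knots[i] - left)
        else:
            phi[x <= knots[i], i] = 1.0
        if right > knots[i]:
            # falling ramp on (knots[i], right]
            m = (x > knots[i]) & (x <= right)
            phi[m, i] = (right - x[m]) / (right - knots[i])
        else:
            phi[x >= knots[i], i] = 1.0
    return phi

def polynomial_moe(x, knots, coeffs):
    """u(x) = sum_i phi_i(x) * p_i(x), with p_i a polynomial expert."""
    phi = hat_basis(x, knots)                                 # (N, K) partition of unity
    powers = np.vander(np.atleast_1d(x), coeffs.shape[1],
                       increasing=True)                       # (N, degree+1) monomials
    experts = powers @ coeffs.T                               # (N, K) expert outputs
    return (phi * experts).sum(axis=1)

# Hypothetical toy usage: 5 knots, quadratic experts.
knots = np.linspace(0.0, 1.0, 5)
coeffs = np.random.default_rng(0).normal(size=(5, 3))        # (num_experts, degree+1)
x = np.linspace(0.0, 1.0, 11)
u = polynomial_moe(x, knots, coeffs)
```

Because each summand is a product of a piecewise linear hat function and a polynomial, it is itself piecewise polynomial, which is what makes exact integration of the model and its derivatives possible in principle; the sketch above only shows evaluation, not the exact-integration machinery.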