We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs), with a high degree of accuracy and up to a desired tolerance. We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture. Our method leverages differentiable optimization and the implicit function theorem to effectively enforce physical constraints. Inspired by dictionary learning, our model learns a family of functions, each of which defines a mapping from PDE parameters to PDE solutions. At inference time, the model finds an optimal linear combination of the functions in the learned family by solving a PDE-constrained optimization problem. Our method provides continuous solutions over the domain of interest that accurately satisfy desired physical constraints. Our results show that incorporating hard constraints directly into the NN architecture leads to much lower test error when compared to training on an unconstrained objective.
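To make the constrained layer concrete, the following is a minimal sketch, not the authors' implementation. It assumes a linear PDE operator (a 1D Laplacian, so the inner PDE-constrained problem reduces to linear least squares over collocation points) and, for readability, drops the mapping from PDE parameters to solutions, working on a single fixed problem. All names (`basis`, `operator_on_basis`, `constrained_layer`, `predict`) and the sinusoidal dictionary are illustrative choices.

```python
import jax
import jax.numpy as jnp

def basis(params, x):
    # Learned "dictionary" of k candidate solutions: here, k sinusoids with
    # trainable frequencies a and phases b; returns [u_1(x), ..., u_k(x)].
    a, b = params                        # each of shape (k,)
    return jnp.sin(a * x + b)

def operator_on_basis(params, x):
    # Apply a linear PDE operator L (here the 1D Laplacian d^2/dx^2) to every
    # basis function at a scalar point x, via nested forward-mode autodiff.
    d2 = jax.jacfwd(jax.jacfwd(lambda z: basis(params, z)))(x)
    return d2                            # shape (k,): [L u_1(x), ..., L u_k(x)]

def constrained_layer(params, x_colloc, f_colloc, ridge=1e-8):
    # Hard-constraint layer: choose coefficients w so that sum_i w_i u_i
    # satisfies L u = f at the collocation points, in the least-squares sense.
    # For a linear operator this inner problem is linear least squares; solving
    # the (ridge-regularized) normal equations with jnp.linalg.solve keeps the
    # layer differentiable, and the resulting gradient matches what the
    # implicit function theorem gives for the inner argmin.
    A = jax.vmap(lambda x: operator_on_basis(params, x))(x_colloc)  # (m, k)
    gram = A.T @ A + ridge * jnp.eye(A.shape[1])
    w = jnp.linalg.solve(gram, A.T @ f_colloc)
    return w

def predict(params, w, x):
    # Evaluate the constrained solution u(x) = sum_i w_i u_i(x) at any point,
    # giving a continuous solution over the domain rather than grid values.
    return jnp.dot(w, basis(params, x))
```

In a training loop, `params` would be fit by backpropagating a data-misfit loss through `constrained_layer`; since the inner solve is an ordinary `jnp.linalg.solve`, `jax.grad` differentiates through it directly, which is what lets the hard-constraint layer sit inside a larger NN architecture.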