Time-dependent partial differential equations (PDEs) for classical physical systems are derived from the conservation of mass, momentum, and energy, and are ubiquitous in scientific and engineering applications. By the principle of locality in physics, these PDEs are strictly local: the evolution at a point is influenced only by a neighborhood around it, whose size is determined by the timestep length multiplied by the speed at which characteristic information travels through the system. Deep learning architectures, however, cannot strictly enforce this local dependency, because the scope of information used to make a local prediction inevitably grows as the number of layers increases. Under limited training data, the extra irrelevant information results in sluggish convergence and compromised generalizability. This paper addresses the problem with a data decomposition method that strictly limits the scope of information available to neural operators when making local predictions, called data decomposition enforcing local-dependency (DDELD). Numerical experiments over multiple physical phenomena show that DDELD significantly accelerates training convergence and reduces the test errors of benchmark models on large-scale engineering simulations.
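To illustrate the locality constraint described above, the sketch below decomposes a 1-D field into overlapping patches whose half-width matches the physical domain of dependence (characteristic speed times timestep, in grid cells), so a local model never sees information that could not physically influence its prediction. This is only a minimal illustration of the general idea, not the paper's DDELD pipeline; the function name, the `halo_pad` parameter, and the 1-D setting are assumptions.

```python
import numpy as np

def decompose_into_local_patches(u, dt, c, dx, halo_pad=0):
    """Split a 1-D field into overlapping patches whose half-width matches
    the physical domain of dependence (characteristic speed * timestep).

    u        : array of shape (n,), field values at the current timestep
    dt       : timestep length
    c        : speed at which characteristic information travels
    dx       : grid spacing
    halo_pad : optional extra cells beyond the physical radius (hypothetical knob)
    """
    # Number of grid cells that information can cross in one timestep.
    radius = int(np.ceil(c * dt / dx)) + halo_pad
    n = u.shape[0]
    # Pad with edge values so boundary points also get a full-size neighborhood.
    padded = np.pad(u, radius, mode="edge")
    # One patch per grid point: patch i covers u[i - radius : i + radius + 1].
    patches = np.stack([padded[i:i + 2 * radius + 1] for i in range(n)])
    return patches  # shape (n, 2 * radius + 1)

# Usage: a small local model maps each patch to the next-step value of its
# center point, so its receptive field never exceeds the domain of dependence.
u = np.sin(np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False))
patches = decompose_into_local_patches(u, dt=1e-3, c=1.0, dx=2.0 * np.pi / 128)
print(patches.shape)
```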