While recent advances in deep learning have shown promising efficiency gains in solving time-dependent partial differential equations (PDEs), matching the accuracy of conventional numerical solvers remains a challenge. One strategy to improve the accuracy of deep learning-based solutions for time-dependent PDEs is to use the learned model as the coarse propagator in the Parareal method and a traditional numerical method as the fine solver. However, successful integration of deep learning into the Parareal method requires consistency between the coarse and fine solvers, particularly for PDEs exhibiting rapid changes such as sharp transitions. To ensure such consistency, we propose using convolutional neural networks (CNNs) to learn the fully discrete time-stepping operator defined by the traditional numerical scheme used as the fine solver. We demonstrate the effectiveness of the proposed method in solving the classical and mass-conservative Allen-Cahn (AC) equations. Through iterative updates in the Parareal algorithm, our approach achieves a significant computational speedup compared to traditional fine solvers while converging to high-accuracy solutions. Our results highlight that the proposed Parareal algorithm effectively accelerates simulations, particularly when implemented on multiple GPUs, and converges to the desired accuracy in only a few iterations. Another advantage of our method is that the CNN model is trained on trajectories generated from random initial conditions, so that the trained model can be used to solve the AC equations with various initial conditions without re-training. This work demonstrates the potential of integrating neural network methods into parallel-in-time frameworks for efficient and accurate simulations of time-dependent PDEs.
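For orientation, the sketch below shows the standard Parareal iteration that the abstract refers to, with the trained CNN playing the role of the coarse propagator and a conventional numerical scheme as the fine solver. The function names (`fine_step`, `coarse_step`) and the loop structure are illustrative placeholders, not taken from the paper; in practice the fine solves over the subintervals are the part executed in parallel (e.g., one per GPU).

```python
import numpy as np

def parareal(u0, fine_step, coarse_step, n_intervals, n_iters):
    """Minimal Parareal sketch (assumed notation, not the paper's code).

    u0          : initial state (e.g., a NumPy array holding the discretized field)
    fine_step   : accurate propagator over one coarse subinterval (traditional scheme)
    coarse_step : cheap propagator over one coarse subinterval (here, the trained CNN)
    """
    # Serial coarse sweep to get an initial guess at the subinterval endpoints.
    U = [u0]
    for n in range(n_intervals):
        U.append(coarse_step(U[n]))

    for k in range(n_iters):
        # Fine solves on each subinterval: independent, hence parallelizable.
        F = [fine_step(U[n]) for n in range(n_intervals)]
        # Coarse solves from the previous iterate (needed for the correction term).
        G_old = [coarse_step(U[n]) for n in range(n_intervals)]
        # Serial correction sweep: U_{n+1}^{k+1} = G(U_n^{k+1}) + F(U_n^k) - G(U_n^k).
        U_new = [u0]
        for n in range(n_intervals):
            G_new = coarse_step(U_new[n])
            U_new.append(G_new + F[n] - G_old[n])
        U = U_new
    return U
```

Any CNN that maps the state at one coarse time level to the next can be dropped in as `coarse_step`; the correction sweep then drives the iterates toward the fine-solver solution over a few iterations.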