In this paper, we present an end-to-end attention-based convolutional recurrent autoencoder (AB-CRAN) network for data-driven modeling of wave propagation phenomena. The proposed network architecture relies on an attention-based recurrent neural network (RNN) with long short-term memory (LSTM) cells. To construct the low-dimensional learning model, we employ a denoising-based convolutional autoencoder trained on full-order snapshots of time-dependent hyperbolic partial differential equations governing wave propagation. We first address the difficulty of evolving the low-dimensional representation in time with a plain RNN-LSTM for wave propagation phenomena. We then build an attention-based sequence-to-sequence RNN-LSTM architecture to predict the solution over a long time horizon. To demonstrate the effectiveness of the proposed learning model, we consider three benchmark problems, namely the one-dimensional linear convection equation, the nonlinear viscous Burgers equation, and the two-dimensional Saint-Venant shallow water system. Using time-series datasets from these benchmark problems, our novel AB-CRAN architecture accurately captures the wave amplitude and preserves the wave characteristics of the solution over long time horizons. The attention-based sequence-to-sequence network extends the prediction time horizon by a factor of five compared to the plain RNN-LSTM. The denoising autoencoder further reduces the mean squared error of prediction and improves the generalization capability in the parameter space.
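The core mechanism the abstract describes is attention over a low-dimensional latent history: full-order snapshots are compressed by a convolutional encoder, and the sequence-to-sequence decoder attends over past latent states when predicting the next one. The following NumPy sketch illustrates that idea only; the linear `encode` stand-in, the dimensions, and the scaled dot-product scoring are illustrative assumptions, not the paper's actual AB-CRAN implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: full-order snapshot dimension, latent dimension,
# and number of past time steps (all hypothetical choices).
n_full, n_latent, T = 64, 8, 10

# Stand-in for the convolutional encoder: a fixed linear projection.
# The actual AB-CRAN uses a denoising-based convolutional autoencoder.
W_enc = rng.standard_normal((n_latent, n_full)) / np.sqrt(n_full)

def encode(snapshot):
    """Project a full-order snapshot onto the low-dimensional latent space."""
    return W_enc @ snapshot

def attention_context(encoder_states, query):
    """Scaled dot-product attention over the encoded latent history.

    encoder_states: (T, n_latent) latent states from past time steps.
    query:          (n_latent,)   current decoder state.
    Returns the attention-weighted context vector and the weights.
    """
    scores = encoder_states @ query / np.sqrt(query.size)
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states
    return context, weights

# Encode a sequence of synthetic snapshots and attend from the latest state.
snapshots = rng.standard_normal((T, n_full))
latents = np.stack([encode(s) for s in snapshots])
context, weights = attention_context(latents[:-1], latents[-1])
```

In the full architecture, `context` would be concatenated with the decoder's LSTM state to predict the next latent vector, which the convolutional decoder then maps back to a full-order field.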