Graph-based next-step prediction models have recently been very successful in modeling complex high-dimensional physical systems on irregular meshes. However, due to their short temporal attention span, these models suffer from error accumulation and drift. In this paper, we propose a new method that captures long-term dependencies through a transformer-style temporal attention model. We introduce an encoder-decoder structure to summarize features and create a compact mesh representation of the system state, allowing the temporal model to operate on a low-dimensional mesh representation in a memory-efficient manner. Our method outperforms a competitive GNN baseline on several complex fluid dynamics prediction tasks, from sonic shocks to vascular flow. We demonstrate stable rollouts without the need for training noise and show perfectly phase-stable predictions even for very long sequences. More broadly, we believe our approach paves the way to bringing the benefits of attention-based sequence models to solving high-dimensional complex physics tasks.
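To make the described pipeline concrete, the following is a minimal sketch (not the authors' implementation) of the three components the abstract names: an encoder that summarizes per-node mesh features into a compact latent state, a transformer-style temporal attention model that operates on the sequence of past latent states, and a decoder that maps the predicted latent back to per-node fields. All module names, sizes, and the mean-pooling scheme are illustrative assumptions.

```python
# Illustrative sketch only; the actual encoder/decoder and pooling in the paper may differ.
import torch
import torch.nn as nn


class MeshEncoder(nn.Module):
    """Summarize per-node mesh features into a compact latent vector (assumed mean pooling)."""
    def __init__(self, node_dim: int, latent_dim: int):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(node_dim, latent_dim), nn.ReLU(),
                                 nn.Linear(latent_dim, latent_dim))

    def forward(self, node_features: torch.Tensor) -> torch.Tensor:
        # node_features: [num_nodes, node_dim] -> latent: [latent_dim]
        return self.mlp(node_features).mean(dim=0)


class MeshDecoder(nn.Module):
    """Expand a predicted latent back to per-node fields (assumed per-node conditioning)."""
    def __init__(self, latent_dim: int, node_dim: int):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(latent_dim + node_dim, latent_dim), nn.ReLU(),
                                 nn.Linear(latent_dim, node_dim))

    def forward(self, latent: torch.Tensor, node_features: torch.Tensor) -> torch.Tensor:
        expanded = latent.expand(node_features.shape[0], -1)
        return self.mlp(torch.cat([expanded, node_features], dim=-1))


class TemporalAttentionModel(nn.Module):
    """Transformer encoder attending over the sequence of past latent mesh states."""
    def __init__(self, latent_dim: int, num_layers: int = 2, num_heads: int = 4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=latent_dim, nhead=num_heads,
                                           batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, latent_history: torch.Tensor) -> torch.Tensor:
        # latent_history: [batch, time, latent_dim]; return a prediction for the next step
        return self.transformer(latent_history)[:, -1]


if __name__ == "__main__":
    num_nodes, node_dim, latent_dim, history = 500, 3, 64, 10
    encoder, decoder = MeshEncoder(node_dim, latent_dim), MeshDecoder(latent_dim, node_dim)
    temporal = TemporalAttentionModel(latent_dim)

    # Encode each past mesh state, attend over the latent history, decode the next state.
    states = [torch.randn(num_nodes, node_dim) for _ in range(history)]
    latents = torch.stack([encoder(s) for s in states]).unsqueeze(0)  # [1, T, latent_dim]
    next_latent = temporal(latents)                                   # [1, latent_dim]
    next_state = decoder(next_latent.squeeze(0), states[-1])          # [num_nodes, node_dim]
    print(next_state.shape)
```

The key design point reflected here is that temporal attention runs only on the low-dimensional latent sequence, so memory cost is independent of mesh size; next-step rollouts are produced by repeatedly feeding the predicted latent back into the history.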