Transformers are widely used in natural language processing due to their ability to model long-range dependencies in text. Although these models achieve state-of-the-art performance for many language-related tasks, their applicability outside of natural language processing has been minimal. In this work, we propose the use of transformer models for the prediction of dynamical systems representative of physical phenomena. The use of Koopman-based embeddings provides a unique and powerful method for projecting any dynamical system into a vector representation, which can then be predicted by a transformer model. The proposed model is able to accurately predict various dynamical systems and outperforms classical methods commonly used in the scientific machine learning literature.
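To make the Koopman idea concrete, the following is a minimal sketch (not the paper's actual architecture, and the transformer predictor is omitted): a nonlinear system is lifted into a small set of observables in which the dynamics become exactly linear, so a single matrix advances the embedded state. The system, its coefficients `a`, `b`, `c`, and the helpers `step` and `embed` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative discrete nonlinear system (an assumption for this sketch):
#   x1[t+1] = a*x1[t]
#   x2[t+1] = b*x2[t] + c*x1[t]**2
a, b, c = 0.9, 0.5, 0.3

def step(x):
    """One step of the nonlinear dynamics in the original state space."""
    x1, x2 = x
    return np.array([a * x1, b * x2 + c * x1**2])

def embed(x):
    """Koopman-style embedding: augment the state with the observable x1**2."""
    x1, x2 = x
    return np.array([x1, x2, x1**2])

# In the embedded coordinates (x1, x2, x1**2) the dynamics are exactly
# linear, so one matrix K plays the role of the Koopman operator.
K = np.array([
    [a,   0.0, 0.0 ],
    [0.0, b,   c   ],
    [0.0, 0.0, a**2],
])

x = np.array([1.0, -0.5])
g = embed(x)
for _ in range(10):
    x = step(x)   # evolve the nonlinear system directly
    g = K @ g     # evolve the linear embedded system

# The first two embedding coordinates recover the true state exactly.
print(np.allclose(g[:2], x))  # True
```

In the paper's setting the embedding is learned rather than hand-picked, and a transformer, rather than a fixed linear operator, predicts the trajectory of the embedded vectors; the sketch only shows why a lifted representation makes a dynamical system amenable to sequence models.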