Classical sequential models employed in time-series prediction learn mappings from past to future instances by way of a hidden state. The hidden state characterises the historical information and encodes the required temporal dependencies. However, most existing sequential models operate within finite-dimensional Euclidean spaces, which offer limited expressiveness when modelling physics-relevant data. Alternatively, recent work on neural operator learning within the Fourier space has demonstrated efficient strategies for parameterising the solutions of Partial Differential Equations (PDEs). In this work, we propose a novel sequential model built to handle physics-relevant data by amalgamating the conventional RNN architecture with the Fourier Neural Operator (FNO). The Fourier-RNN learns the mappings from the input to the output, as well as to the hidden state, within the Fourier space associated with the temporal data. While the Fourier-RNN performs on par with the FNO when handling PDE data, it outperforms both the FNO and the conventional RNN when deployed to model noisy, non-Markovian data.
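The core idea of combining a recurrent hidden state with spectral-space operator learning can be illustrated with a minimal sketch. The following is a hypothetical, simplified Fourier-RNN cell, not the paper's exact parameterisation: a 1-D spectral convolution (FFT, rescale the lowest modes by learned complex weights, inverse FFT) is applied to both the input-to-hidden and hidden-to-hidden maps, and the results are combined through a pointwise nonlinearity.

```python
import numpy as np

def spectral_conv(u, weights, modes):
    """1-D spectral convolution: FFT, transform the lowest `modes`
    Fourier coefficients with learned complex weights, inverse FFT."""
    u_hat = np.fft.rfft(u)                     # complex spectrum, length n//2 + 1
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = u_hat[:modes] * weights  # keep/transform low modes only
    return np.fft.irfft(out_hat, n=u.shape[0])

def fourier_rnn_cell(x_t, h_prev, Wx, Wh, modes):
    """One recurrent step: both the input and the hidden state pass
    through spectral convolutions before a pointwise nonlinearity.
    (Illustrative only; the actual cell in the paper may differ.)"""
    return np.tanh(spectral_conv(x_t, Wx, modes)
                   + spectral_conv(h_prev, Wh, modes))

# Toy rollout on a random spatio-temporal signal discretised on 64 points.
rng = np.random.default_rng(0)
n, steps, modes = 64, 5, 8
Wx = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)
Wh = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)
h = np.zeros(n)
for _ in range(steps):
    x_t = rng.standard_normal(n)
    h = fourier_rnn_cell(x_t, h, Wx, Wh, modes)
print(h.shape)
```

Because the weights act on Fourier coefficients rather than on grid values, the same learned cell can in principle be evaluated on any spatial discretisation, which is the property that makes operator-learning approaches attractive for PDE data.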