Recent studies have shown that deep learning models such as RNNs and Transformers have brought significant performance gains to long-term time series forecasting because they effectively utilize historical information. We found, however, that there is still great room for improvement in how historical information is preserved in neural networks while avoiding overfitting to noise present in the history. Addressing this allows better exploitation of the capabilities of deep learning models. To this end, we design a \textbf{F}requency \textbf{i}mproved \textbf{L}egendre \textbf{M}emory model, or {\bf FiLM}: it applies Legendre polynomial projections to approximate historical information, uses Fourier projection to remove noise, and adds a low-rank approximation to speed up computation. Our empirical studies show that the proposed FiLM improves the accuracy of state-of-the-art models by \textbf{20.3\%} and \textbf{22.6\%} in multivariate and univariate long-term forecasting, respectively. We also demonstrate that the representation module developed in this work can be used as a general plug-in to improve the long-term prediction performance of other deep learning modules. Code will be released soon.
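To make the two representation ideas named above concrete, below is a minimal, illustrative Python sketch of (i) projecting a history window onto a small Legendre polynomial basis and (ii) a crude Fourier low-pass step that discards high-frequency noise. The function names, window length, and coefficient counts are hypothetical choices for illustration, not the paper's implementation.

\begin{verbatim}
import numpy as np
from numpy.polynomial import legendre as L

def legendre_projection(x, n_coeffs=16):
    """Project a 1-D history window onto the first n_coeffs Legendre
    polynomials and return (coefficients, smooth reconstruction)."""
    T = len(x)
    # Legendre polynomials are orthogonal on [-1, 1]; map time steps there.
    t = np.linspace(-1.0, 1.0, T)
    coeffs = L.legfit(t, x, deg=n_coeffs - 1)  # least-squares fit
    recon = L.legval(t, coeffs)                # compressed view of the history
    return coeffs, recon

def fourier_denoise(x, n_modes=8):
    """Keep only the n_modes lowest frequencies of x (simple noise filter)."""
    spec = np.fft.rfft(x)
    spec[n_modes:] = 0.0
    return np.fft.irfft(spec, n=len(x))

# Toy usage: a noisy sine wave standing in for the historical input.
t = np.arange(256)
history = np.sin(2 * np.pi * t / 64) + 0.3 * np.random.randn(256)
coeffs, smooth_history = legendre_projection(fourier_denoise(history))
\end{verbatim}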