The performance of time series forecasting has recently been greatly improved by the introduction of transformers. In this paper, we propose a general multi-scale framework that can be applied to state-of-the-art transformer-based time series forecasting models, including Autoformer and Informer. By iteratively refining a forecasted time series at multiple scales with shared weights, architecture adaptations, and a specially designed normalization scheme, we are able to achieve significant performance improvements with minimal additional computational overhead. Via detailed ablation studies, we demonstrate the effectiveness of our proposed architectural and methodological innovations. Furthermore, our experiments on four public datasets show that the proposed multi-scale framework outperforms the corresponding baselines, with average improvements of 13% and 38% over Autoformer and Informer, respectively.
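The coarse-to-fine refinement loop described above can be sketched schematically. This is a minimal illustration, not the paper's implementation: the `avg_pool`, `upsample`, and `multiscale_forecast` names, the scale schedule, and the mean-subtraction normalization are all illustrative stand-ins for the framework's actual components, and `model` is a placeholder for a shared-weight transformer forecaster.

```python
import numpy as np

def avg_pool(x, factor):
    # Downsample a 1-D series by average pooling (illustrative helper).
    n = (len(x) // factor) * factor
    return x[:n].reshape(-1, factor).mean(axis=1)

def upsample(x, factor):
    # Nearest-neighbour upsampling back to the next finer scale.
    return np.repeat(x, factor)

def multiscale_forecast(history, horizon, model, scales=(8, 4, 2, 1)):
    # Coarse-to-fine refinement with a single shared `model`:
    # forecast at the coarsest scale first, then at each finer scale
    # re-forecast conditioned on the upsampled previous-scale output.
    forecast = np.zeros(horizon // scales[0])
    for i, s in enumerate(scales):
        if i > 0:
            forecast = upsample(forecast, scales[i - 1] // s)
        coarse_hist = avg_pool(history, s)
        # Per-scale mean normalization (a simplified stand-in for the
        # paper's specially designed normalization scheme).
        mu = coarse_hist.mean()
        forecast = model(coarse_hist - mu, forecast - mu, horizon // s) + mu
    return forecast
```

With `scales=(8, 4, 2, 1)` and a horizon divisible by 8, the loop produces intermediate forecasts of length `horizon/8`, `horizon/4`, `horizon/2`, and finally `horizon`, each refining the previous one while the same `model` weights are reused at every scale.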