Extending the forecasting horizon is a critical demand for real applications, such as extreme weather early warning and long-term energy consumption planning. This paper studies the long-term forecasting problem of time series. Prior Transformer-based models adopt various self-attention mechanisms to discover long-range dependencies. However, the intricate temporal patterns of the long-term future prevent the model from finding reliable dependencies. Moreover, Transformers have to adopt sparse versions of point-wise self-attention to remain efficient on long series, resulting in an information utilization bottleneck. Going beyond Transformers, we design Autoformer as a novel decomposition architecture with an Auto-Correlation mechanism. We break with the pre-processing convention of series decomposition and renovate it as a basic inner block of deep models. This design empowers Autoformer with progressive decomposition capacities for complex time series. Further, inspired by stochastic process theory, we design the Auto-Correlation mechanism based on series periodicity, which conducts dependency discovery and representation aggregation at the sub-series level. Auto-Correlation outperforms self-attention in both efficiency and accuracy. In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a 38% relative improvement on six benchmarks covering five practical applications: energy, traffic, economics, weather and disease. Code is available at this repository: \url{https://github.com/thuml/Autoformer}.
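To make the two core ideas concrete, below is a minimal PyTorch sketch, not the repository's exact implementation: a moving-average series decomposition block (trend-cyclical part extracted by average pooling, seasonal part as the remainder) and a simplified single-head Auto-Correlation that estimates period-based dependencies in the frequency domain and aggregates the value series at the top-k lags. All names (`SeriesDecomp`, `auto_correlation`, `top_k`) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SeriesDecomp(nn.Module):
    """Inner decomposition block: a moving average extracts the smooth
    trend-cyclical part; the seasonal part is the remainder."""

    def __init__(self, kernel_size: int):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=0)

    def forward(self, x):  # x: [batch, length, channels]
        # Replicate the series ends so the moving average keeps the length.
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        end = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, end], dim=1)
        trend = self.avg(padded.permute(0, 2, 1)).permute(0, 2, 1)
        seasonal = x - trend
        return seasonal, trend


def auto_correlation(q, k, v, top_k: int = 3):
    """Simplified single-head Auto-Correlation on [batch, length, channels].

    Autocorrelation is computed in the frequency domain (Wiener-Khinchin),
    the top-k lags (candidate periods) are selected, and the value series
    rolled by those lags are aggregated with softmax-normalized weights,
    i.e. dependencies are discovered and aggregated at the sub-series level.
    """
    length = q.shape[1]
    q_fft = torch.fft.rfft(q, dim=1)
    k_fft = torch.fft.rfft(k, dim=1)
    corr = torch.fft.irfft(q_fft * torch.conj(k_fft), n=length, dim=1)
    mean_corr = corr.mean(dim=(0, 2))             # rank lags, shape [length]
    weights, lags = torch.topk(mean_corr, top_k)  # strongest periods
    weights = torch.softmax(weights, dim=0)
    out = torch.zeros_like(v)
    for w, lag in zip(weights, lags):
        # Time-delay aggregation: align sub-series by rolling the values.
        out = out + w * torch.roll(v, shifts=-int(lag), dims=1)
    return out
```

Because the candidate lags are ranked once per series rather than per time point, this aggregation touches only `top_k` rolled copies of the sequence, which is the intuition behind Auto-Correlation's efficiency advantage over point-wise self-attention.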