Transformers have achieved superior performance in many tasks in natural language processing and computer vision, which has also triggered great interest in the time series community. Among the multiple advantages of Transformers, the ability to capture long-range dependencies and interactions is especially attractive for time series modeling, leading to exciting progress in various time series applications. In this paper, we systematically review Transformer schemes for time series modeling, highlighting their strengths as well as their limitations. In particular, we examine the development of time series Transformers from two perspectives. From the perspective of network structure, we summarize the adaptations and modifications that have been made to Transformers to accommodate the challenges of time series analysis. From the perspective of applications, we categorize time series Transformers based on common tasks, including forecasting, anomaly detection, and classification. Empirically, we perform robustness analysis, model size analysis, and seasonal-trend decomposition analysis to study how Transformers perform on time series. Finally, we discuss and suggest future directions to provide useful research guidance. A corresponding resource, continuously updated, can be found in the GitHub repository. To the best of our knowledge, this paper is the first work to comprehensively and systematically summarize the recent advances of Transformers for modeling time series data. We hope this survey will spark further research interest in time series Transformers.