Transformers have achieved superior performance in many tasks in natural language processing and computer vision, which has also sparked great interest in the time series community. Among the many advantages of Transformers, the ability to capture long-range dependencies and interactions is especially attractive for time series modeling, leading to exciting progress in various time series applications. In this paper, we systematically review Transformer schemes for time series modeling, highlighting their strengths as well as their limitations through a new taxonomy that summarizes existing time series Transformers from two perspectives. From the perspective of network modifications, we summarize the adaptations of time series Transformers at the module level and the architecture level. From the perspective of applications, we categorize time series Transformers based on common tasks, including forecasting, anomaly detection, and classification. Empirically, we perform robustness analysis, model size analysis, and seasonal-trend decomposition analysis to study how Transformers perform on time series. Finally, we discuss and suggest future directions to provide useful research guidance. To the best of our knowledge, this paper is the first work to comprehensively and systematically summarize the recent advances in Transformers for modeling time series data. We hope this survey will ignite further research interest in time series Transformers.