Multivariate time series forecasting has seen wide-ranging applications in domains such as finance, traffic, energy, and healthcare. To capture sophisticated temporal patterns, many studies have designed complex neural network architectures based on variants of RNNs, GNNs, and Transformers. However, complex models are often computationally expensive and therefore face severe challenges in training and inference efficiency when applied to large-scale real-world datasets. In this paper, we introduce LightTS, a light deep learning architecture based merely on simple MLP-based structures. The key idea of LightTS is to apply an MLP-based structure on top of two delicate down-sampling strategies, interval sampling and continuous sampling, motivated by the crucial observation that down-sampling a time series often preserves the majority of its information. We conduct extensive experiments on eight widely used benchmark datasets. Compared with existing state-of-the-art (SOTA) methods, LightTS achieves better accuracy on five of them and comparable accuracy on the rest. Moreover, LightTS is highly efficient: it uses less than 5% of the FLOPs of previous SOTA methods on the largest benchmark dataset. In addition, LightTS is robust, exhibiting a much smaller variance in forecasting accuracy than previous SOTA methods on long sequence forecasting tasks.
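To make the two down-sampling strategies concrete, the following is a minimal NumPy sketch, not the authors' implementation: the function names (`interval_sample`, `continuous_sample`) and the assumption that the sequence length is divisible by the down-sampling factor are illustrative choices of ours. Continuous sampling keeps each contiguous chunk intact (local patterns), while interval sampling takes strided points so each subsequence retains the overall shape of the series.

```python
import numpy as np

def continuous_sample(x, num_subseq):
    # Split a length-T series into `num_subseq` contiguous chunks.
    # x: shape (T,); assumes T is divisible by num_subseq (illustrative assumption).
    T = x.shape[0]
    return x.reshape(num_subseq, T // num_subseq)  # row i = i-th contiguous chunk

def interval_sample(x, num_subseq):
    # Down-sample with stride `num_subseq`, using offsets 0..num_subseq-1.
    # Row i equals x[i::num_subseq], a strided subsequence of the original series.
    T = x.shape[0]
    return x.reshape(T // num_subseq, num_subseq).T

# Toy usage on a length-8 series with down-sampling factor 2.
x = np.arange(8)
print(continuous_sample(x, 2))  # [[0 1 2 3], [4 5 6 7]]
print(interval_sample(x, 2))    # [[0 2 4 6], [1 3 5 7]]
```

In LightTS, MLP-based structures are then applied on top of such down-sampled subsequences; the sketch above only illustrates how the subsequences themselves could be formed.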