Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning. Transformer models have been adopted to deliver high prediction capacity thanks to their self-attention mechanism, which is, however, computationally expensive. Though one could lower the complexity of Transformers by introducing sparsity into the point-wise self-attention for LTTF, the limited information utilization prevents the model from exploring the complex dependencies comprehensively. To this end, we propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects: (i) an encoder-decoder architecture with linear complexity and no sacrifice of information utilization is built on top of sliding-window attention and the Stationary and Instant Recurrent Network (SIRN); (ii) a module derived from normalizing flows is devised to further improve information utilization by inferring the outputs directly from the latent variables in SIRN; (iii) the inter-series correlation and temporal dynamics in time-series data are modeled explicitly to fuel the downstream self-attention mechanism. Extensive experiments on seven real-world datasets demonstrate that Conformer outperforms state-of-the-art methods on LTTF and generates reliable prediction results with uncertainty quantification.
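To make design point (i) concrete, the following is a minimal PyTorch sketch of sliding-window self-attention, in which each time step attends only to the 2w+1 positions around it, so the cost grows as O(L*w) in the series length L rather than the O(L^2) of full self-attention. This is an illustration of the mechanism the abstract names, not the Conformer implementation: the function name, the symmetric window radius w, and the zero-padding at the sequence boundaries are our assumptions, and a careful implementation would also mask the padded positions out of the softmax.

    import torch
    import torch.nn.functional as F

    def sliding_window_attention(q, k, v, w):
        # q, k, v: (B, L, d); each query attends to keys in [t - w, t + w],
        # giving O(L * (2w + 1)) time and memory instead of O(L^2).
        B, L, d = q.shape
        k_pad = F.pad(k, (0, 0, w, w))          # zero-pad the time axis: (B, L + 2w, d)
        v_pad = F.pad(v, (0, 0, w, w))
        k_win = k_pad.unfold(1, 2 * w + 1, 1)   # gather local windows: (B, L, d, 2w + 1)
        v_win = v_pad.unfold(1, 2 * w + 1, 1)
        scores = torch.einsum("bld,bldw->blw", q, k_win) / d ** 0.5
        # padded boundary positions are treated as zero keys/values here for brevity
        attn = scores.softmax(dim=-1)           # (B, L, 2w + 1)
        return torch.einsum("blw,bldw->bld", attn, v_win)

    # toy usage: batch 2, series length 96, model dimension 16, window radius 8
    out = sliding_window_attention(torch.randn(2, 96, 16), torch.randn(2, 96, 16),
                                   torch.randn(2, 96, 16), w=8)
    print(out.shape)                            # torch.Size([2, 96, 16])

The unfold-based gather keeps memory linear in L as well; masking a full L-by-L score matrix would reproduce the same attention pattern but still allocate quadratic memory, which is exactly the cost such a design avoids.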