Distribution shift in Time Series Forecasting (TSF), i.e., the change of series distributions over time, largely hinders the performance of TSF models. Existing works on distribution shift in time series are mostly limited in their quantification of distributions and, more importantly, overlook the potential shift between lookback and horizon windows. To address the above challenges, we systematically summarize distribution shift in TSF into two categories. Regarding lookback windows as input-space and horizon windows as output-space, there exist (i) intra-space shift, where the distribution within the input-space keeps shifting over time, and (ii) inter-space shift, where the distribution shifts between input-space and output-space. We then introduce Dish-TS, a general neural paradigm for alleviating distribution shift in TSF. Specifically, for better distribution estimation, we propose the coefficient net (CONET), which can be any neural architecture, to map input sequences into learnable distribution coefficients. To relieve intra-space and inter-space shift, we organize Dish-TS as a Dual-CONET framework that separately learns the distributions of the input- and output-space, naturally capturing the distribution difference between the two spaces. In addition, we introduce a more effective training strategy for the otherwise intractable CONET learning. Finally, we conduct extensive experiments on several datasets coupled with different state-of-the-art forecasting models. Experimental results show that Dish-TS consistently boosts them by more than 20% on average. Code is available.
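To make the paradigm concrete, the following is a minimal PyTorch sketch of the Dual-CONET idea as described above. The class names, the single-linear-layer coefficient heads, and the shape conventions are illustrative assumptions for exposition, not the paper's exact implementation; as stated in the abstract, CONET can be any neural architecture.

```python
import torch
import torch.nn as nn

class CONET(nn.Module):
    """A minimal coefficient net: maps a lookback window to learnable
    level (location) and scale (spread) distribution coefficients.
    A single linear layer per coefficient is just one simple choice."""
    def __init__(self, lookback_len: int):
        super().__init__()
        self.level = nn.Linear(lookback_len, 1)   # distribution "location"
        self.scale = nn.Linear(lookback_len, 1)   # distribution "spread"

    def forward(self, x):                          # x: (batch, n_series, lookback)
        phi = self.level(x)                        # (batch, n_series, 1)
        xi = torch.relu(self.scale(x)) + 1e-8      # keep the scale positive
        return phi, xi

class DishTS(nn.Module):
    """Sketch of the Dual-CONET wrapper: one CONET estimates the
    input-space distribution (to normalize the lookback window), while a
    second CONET infers the output-space distribution from the same
    input (to denormalize the forecast), so the inter-space shift is
    modeled explicitly rather than assumed away."""
    def __init__(self, backbone: nn.Module, lookback_len: int):
        super().__init__()
        self.backbone = backbone                   # any forecasting model
        self.backconet = CONET(lookback_len)       # input-space coefficients
        self.horiconet = CONET(lookback_len)       # output-space coefficients

    def forward(self, x):                          # x: (batch, n_series, lookback)
        phi_b, xi_b = self.backconet(x)            # input-space distribution
        phi_h, xi_h = self.horiconet(x)            # output-space distribution
        x_norm = (x - phi_b) / xi_b                # relieve intra-space shift
        y_norm = self.backbone(x_norm)             # forecast in normalized space
        return y_norm * xi_h + phi_h               # relieve inter-space shift
```

In this sketch, the wrapper is model-agnostic: any backbone mapping a normalized lookback window of shape (batch, n_series, lookback) to a horizon window of shape (batch, n_series, horizon) can be plugged in unchanged, since normalization and denormalization happen outside it.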