While classical time series forecasting treats each time series in isolation, recent deep-learning-based advances have shown that jointly learning from a large pool of related time series can boost forecasting accuracy. However, the accuracy of these methods degrades sharply when modeling out-of-sample time series, significantly limiting their applicability compared to classical forecasting methods. To bridge this gap, we adopt a meta-learning view of the time series forecasting problem. We introduce a novel forecasting method, called Meta Global-Local Auto-Regression (Meta-GLAR), that adapts to each time series by learning, in closed form, the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts. Crucially, the parameters of the RNN are learned across multiple time series by backpropagating through the closed-form adaptation mechanism. In an extensive empirical evaluation, we show that our method is competitive with the state-of-the-art out-of-sample forecasting accuracy reported in earlier work.
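The core mechanism described above — a shared RNN encoder whose representations feed a per-series linear readout obtained in closed form, with gradients flowing through that solve — can be illustrated with a minimal sketch. This is not the authors' code: the ridge-regression readout, the toy series pool, and all names (`hidden_dim`, `ridge_lambda`, `closed_form_forecast`) are illustrative assumptions.

```python
# Hedged sketch of the global-local idea: a globally shared RNN produces
# representations H for each series; a series-specific readout W is computed
# in CLOSED FORM by ridge regression, and the training loss backpropagates
# THROUGH that solve into the shared RNN parameters.
import torch

torch.manual_seed(0)
hidden_dim, ridge_lambda = 8, 1.0  # illustrative hyperparameters

# globally shared encoder, learned across the whole pool of series
rnn = torch.nn.RNN(input_size=1, hidden_size=hidden_dim, batch_first=True)

def closed_form_forecast(series):
    """series: 1-D tensor (T,). Returns one-step-ahead fits (T-1,)."""
    x = series[:-1].view(1, -1, 1)   # inputs  y_1 .. y_{T-1}
    y = series[1:].view(-1, 1)       # targets y_2 .. y_T
    H, _ = rnn(x)                    # RNN representations, (1, T-1, d)
    H = H.squeeze(0)                 # (T-1, d)
    # per-series ridge solution W = (H^T H + lambda*I)^{-1} H^T y;
    # torch.linalg.solve is differentiable, so gradients reach the RNN
    A = H.T @ H + ridge_lambda * torch.eye(hidden_dim)
    W = torch.linalg.solve(A, H.T @ y)   # local adaptation, no gradient steps
    return (H @ W).squeeze(-1)

# one meta-training step across a small pool of toy related series
opt = torch.optim.SGD(rnn.parameters(), lr=0.01)
pool = [torch.sin(0.3 * torch.arange(30.0) + p) for p in (0.0, 1.0, 2.0)]
loss = sum(((closed_form_forecast(s) - s[1:]) ** 2).mean() for s in pool)
opt.zero_grad()
loss.backward()   # gradients flow through the closed-form solve
opt.step()
```

Because the local adaptation is a single linear solve rather than inner-loop gradient steps, an out-of-sample series can be handled at inference time by re-solving for its own W while keeping the shared RNN fixed.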