Forecasting future outcomes from recent time series data is difficult, especially when the future data differ from the past (i.e., the time series undergoes temporal drift). Existing approaches show limited performance under such drift, and we identify the main reason: whenever the underlying dynamics change, a model needs time to collect sufficient training data and adjust its parameters to the new, complicated temporal patterns. To address this issue, we study a new approach: instead of adjusting model parameters by continuously re-training a model on new data, we build a hypernetwork that generates the parameters of other target models, chosen so that they are expected to perform well on the future data. The model parameters can therefore be adjusted in advance (provided the hypernetwork is accurate). We conduct extensive experiments with 6 target models, 6 baselines, and 4 datasets, and show that our HyperGPA outperforms the other baselines.
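The core idea above can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's HyperGPA architecture: a "hypernetwork" maps a summary of recent history to the single parameter of a tiny AR(1) target model, so the target's parameter is adapted before the drifted future arrives.

```python
# Toy sketch (hypothetical, NOT the paper's HyperGPA):
# the "hypernetwork" generates the parameter w of an AR(1)
# target model y_{t+1} = w * y_t from recent observations.

def hypernet(history):
    """Toy hypernetwork: infer the AR(1) weight from recent step ratios."""
    ratios = [b / a for a, b in zip(history, history[1:]) if a != 0]
    return sum(ratios) / len(ratios)  # generated target-model parameter

def target_model(w, y_last, horizon):
    """AR(1) target model whose weight w was produced by the hypernetwork."""
    preds = []
    for _ in range(horizon):
        y_last = w * y_last
        preds.append(y_last)
    return preds

# A drifting series: its growth rate slides from 1.10x to 1.28x per step,
# so a model fit on old data would underestimate the future.
past = [1.0]
for t in range(10):
    rate = 1.1 + 0.02 * t
    past.append(past[-1] * rate)

w = hypernet(past[-4:])                    # parameter generated from recent data
forecast = target_model(w, past[-1], horizon=3)
```

Here the hypernetwork sees only the most recent window, so the generated parameter already reflects the drifted dynamics instead of an average over the whole history; the real method replaces this hand-crafted mapping with a learned network that emits full parameter sets for complex target models.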