Deep probabilistic time series forecasting has gained significant attention due to its strong capacity for nonlinear approximation and its ability to provide valuable uncertainty quantification for decision-making tasks. However, many existing models oversimplify the problem by assuming that the error process is time-independent, thereby overlooking its serial correlation. To overcome this limitation, we propose an innovative training method that incorporates error autocorrelation to further enhance the accuracy of probabilistic forecasting. Our method constructs a mini-batch as a collection of $D$ consecutive time series segments for model training and explicitly learns a time-varying covariance matrix over each mini-batch that encodes the error correlation among adjacent time steps. The learned covariance matrix can be used to improve prediction accuracy and enhance uncertainty quantification. We evaluate our method on two different neural forecasting models and multiple public datasets, and the experimental results confirm the effectiveness of the proposed approach: both models improve across a wide range of datasets, with notable gains in predictive accuracy.
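As a rough illustration of the training objective described in the abstract, the sketch below shows one way to replace independent per-step Gaussian likelihoods with a joint likelihood over $D$ consecutive forecast errors. It is a minimal PyTorch sketch under stated assumptions: the class name `AutocorrelatedGaussianLoss`, the inputs `mu`, `sigma`, `rho`, and the AR(1)-style Toeplitz parameterization of the correlation matrix are illustrative choices, not the paper's exact formulation.

```python
# Hypothetical sketch: joint Gaussian NLL over D consecutive errors with a
# learned, time-varying correlation structure (an assumed AR(1)-style
# parameterization, not necessarily the paper's).
import torch
import torch.nn as nn


class AutocorrelatedGaussianLoss(nn.Module):
    """Negative log-likelihood of D consecutive forecast errors under a
    multivariate Gaussian whose correlation matrix captures serial dependence."""

    def forward(self, mu, sigma, rho, target):
        # mu, sigma, target: (batch, D); rho: (batch,) in (0, 1), e.g. the
        # output of a small sigmoid head, giving a per-batch (time-varying)
        # correlation coefficient between adjacent steps.
        D = mu.size(-1)
        idx = torch.arange(D, device=mu.device)
        # Toeplitz correlation matrix R[i, j] = rho ** |i - j|: one simple way
        # to encode stronger correlation between nearby time steps.
        lags = (idx.view(1, -1, 1) - idx.view(1, 1, -1)).abs()
        R = rho.view(-1, 1, 1) ** lags
        # Covariance Sigma = diag(sigma) @ R @ diag(sigma).
        Sigma = sigma.unsqueeze(-1) * R * sigma.unsqueeze(-2)
        dist = torch.distributions.MultivariateNormal(
            loc=mu, covariance_matrix=Sigma
        )
        return -dist.log_prob(target).mean()
```

In use, the base forecasting model would output `mu` and `sigma` for each of the $D$ consecutive steps in a mini-batch, while an additional head predicts `rho`; minimizing this joint likelihood lets the covariance structure, rather than an independence assumption, shape both the point forecasts and the predictive intervals.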