In this paper, we obtain generic bounds on the variances of estimation and prediction errors in time series analysis via an information-theoretic approach. In general, the error bounds are determined by the conditional entropy of the data point to be estimated or predicted, given the side information or past observations. Additionally, we show that the prediction error bounds are achieved asymptotically if and only if the "innovation" is asymptotically white Gaussian. When restricted to Gaussian processes and 1-step prediction, our bounds reduce to the Kolmogorov-Szeg\"o formula and the Wiener-Masani formula known from linear prediction theory.
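To make the shape of these results concrete, the following is a minimal sketch, not the paper's exact statements; the notation and the spectral-density normalization $\mathbb{E}[x_k^2] = \frac{1}{2\pi}\int_{-\pi}^{\pi} S(\omega)\,\mathrm{d}\omega$ are assumptions. For any estimator $\hat{x} = g(y)$ of a random variable $x$ from side information $y$, the maximum-entropy property of the Gaussian distribution gives
\[
\mathbb{E}\big[(x - g(y))^2\big] \;\ge\; \frac{1}{2\pi e}\, e^{2 h(x \mid y)},
\]
with equality (under the same assumptions) when $x$ given $y$ is Gaussian with conditional mean $g(y)$ and a conditional variance that does not depend on $y$. For a stationary Gaussian process with power spectral density $S(\omega)$, applying such a bound to 1-step prediction from the infinite past recovers the Kolmogorov-Szeg\"o formula
\[
\inf_{g}\, \mathbb{E}\big[(x_k - g(\ldots, x_{k-2}, x_{k-1}))^2\big]
\;=\; \exp\!\left( \frac{1}{2\pi} \int_{-\pi}^{\pi} \ln S(\omega)\,\mathrm{d}\omega \right).
\]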