Empirical risk minimization is a standard principle for choosing algorithms in learning theory. In this paper we study the properties of empirical risk minimization for time series. The analysis is carried out in a general framework that covers different types of forecasting applications encountered in the literature. We are concerned with one-step-ahead prediction of a univariate time series generated by a parameter-driven process. A class of recursive algorithms is available to forecast the time series. The algorithms are recursive in the sense that the forecast produced in a given period is a function of lagged values of the forecast and of the time series. The relationship between the data-generating mechanism of the time series and the class of algorithms is left unspecified. Our main result establishes that the algorithm chosen by empirical risk minimization asymptotically achieves the optimal predictive performance attainable within the class of algorithms.
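As a minimal sketch of this setup, under assumed notation (the parameter set $\Theta$, the recursion map $f_\theta$, the loss function $\ell$, and the initialization are placeholders introduced here, not taken from the paper), each algorithm in the class is indexed by $\theta \in \Theta$ and produces one-step-ahead forecasts through the recursion
\[
\hat{y}_{t+1}(\theta) = f_\theta\bigl(\hat{y}_t(\theta),\, y_t\bigr), \qquad \hat{y}_1(\theta) \ \text{fixed},
\]
and empirical risk minimization selects the index that minimizes the average in-sample loss,
\[
\hat{\theta}_T \in \arg\min_{\theta \in \Theta} \; \frac{1}{T} \sum_{t=1}^{T} \ell\bigl(y_t,\, \hat{y}_t(\theta)\bigr).
\]
Read in these terms, the main result states that the predictive risk of the algorithm indexed by $\hat{\theta}_T$ converges to $\inf_{\theta \in \Theta}$ of the predictive risk, that is, to the best performance attainable within the class.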