The k-nearest neighbors algorithm is one of the most prominent techniques used in classification and regression. Despite its simplicity, k-nearest neighbors has been successfully applied to time series forecasting. However, selecting the number of neighbors and the features is a daunting task. In this paper, we introduce two methodologies for forecasting time series that we refer to as Classical Parameters Tuning in Weighted Nearest Neighbors and Fast Parameters Tuning in Weighted Nearest Neighbors. The first approach uses classical parameter tuning, comparing the most recent subsequence with every possible subsequence of the same length from the past. The second approach reduces the neighbors' search set, which leads to a significantly smaller grid size and hence lower computational time. To tune the models' parameters, both methods implement an approach inspired by cross-validation for weighted nearest neighbors. We evaluate the forecasting performance and accuracy of our models. Then, we compare them to some classical approaches, namely the Seasonal Autoregressive Integrated Moving Average, Holt-Winters, and Exponential Smoothing State Space models. Real data examples on retail and food services sales in the USA and milk production in the UK are analyzed to demonstrate the application and the efficiency of the proposed approaches.
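The first approach described above can be illustrated with a minimal sketch. This is not the authors' implementation: the window length `m`, the number of neighbors `k`, the Euclidean distance, and the inverse-distance weighting scheme are all illustrative assumptions; the paper tunes such parameters via its cross-validation-inspired procedure.

```python
import numpy as np

def knn_forecast(series, m=4, k=3):
    """Weighted nearest-neighbor one-step forecast (illustrative sketch).

    Compares the most recent subsequence of length m with every earlier
    subsequence of the same length, then averages the values that followed
    the k closest matches, weighted by inverse distance.
    """
    x = np.asarray(series, dtype=float)
    query = x[-m:]  # the most recent subsequence
    dists, successors = [], []
    # Every past window x[i:i+m] with a known successor x[i+m];
    # the query window itself (i = len(x) - m) is excluded by the range.
    for i in range(len(x) - m):
        dists.append(np.linalg.norm(x[i:i + m] - query))
        successors.append(x[i + m])
    dists = np.asarray(dists)
    successors = np.asarray(successors)
    idx = np.argsort(dists)[:k]       # indices of the k nearest neighbors
    w = 1.0 / (dists[idx] + 1e-9)     # inverse-distance weights (assumed scheme)
    return float(np.sum(w * successors[idx]) / np.sum(w))
```

On a perfectly periodic series the nearest neighbors are exact matches, so the forecast reproduces the next value of the cycle, e.g. `knn_forecast([1, 2, 3, 4] * 5)` returns `1.0`.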