KNN classification is a lazy learning method: computation is deferred until a test sample must be predicted, at which point a suitable value of K is set and the K nearest neighbors are searched for in the entire training sample space. This deferred work is referred to as the lazy part of KNN classification, and it has been the bottleneck in applying KNN classification because of the exhaustive search for the K nearest neighbors. In this paper, a one-step computation is proposed to replace the lazy part of KNN classification. The one-step computation transforms the lazy part into a matrix computation as follows. Given a test sample, the training samples are first fitted to it under a least squares loss function. A relationship matrix is then generated by weighting all training samples according to their influence on the test sample. Finally, a group lasso is employed to perform sparse learning on the relationship matrix. In this way, setting the value of K and searching for the K nearest neighbors are integrated into a single unified computation. In addition, a new classification rule is proposed to improve the performance of one-step KNN classification. The proposed approach is evaluated experimentally, and the results demonstrate that one-step KNN classification is efficient and promising.
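The pipeline described above (least-squares reconstruction of test samples from training samples, a relationship matrix, group-lasso sparsification, then a weighted vote) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the grouping choice (one group per training sample, shared across test samples), the nonnegativity constraint on the weights, the proximal-gradient solver, and the hyperparameters `lam` and `n_iter` are all our assumptions for the sketch.

```python
import numpy as np

def one_step_knn(X_train, y_train, X_test, lam=0.5, n_iter=500):
    """Illustrative one-step KNN: fit each test sample as a combination
    of training samples under a least-squares loss, and shrink the
    relationship matrix with a group lasso so that only a sparse set of
    training samples (the implicit "K nearest neighbors") keeps nonzero
    weight. lam and n_iter are illustrative, not values from the paper."""
    n, m = X_train.shape[0], X_test.shape[0]
    W = np.zeros((m, n))                      # relationship matrix
    G = X_train @ X_train.T                   # Gram matrix of training set
    step = 1.0 / (2 * np.linalg.norm(G, 2))   # 1 / Lipschitz constant
    for _ in range(n_iter):
        # gradient of the least-squares reconstruction loss ||W X - T||^2
        grad = 2 * (W @ X_train - X_test) @ X_train.T
        W = W - step * grad
        # keep weights nonnegative (an assumption: "influence" weights)
        W = np.maximum(W, 0.0)
        # group-lasso proximal step: shrink each column of W (one group
        # per training sample) toward zero, zeroing out non-neighbors
        norms = np.linalg.norm(W, axis=0)
        W *= np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
    # classification rule (simplified): weighted vote over the training
    # samples that retain nonzero weight
    classes = np.unique(y_train)
    scores = np.stack([W[:, y_train == c].sum(axis=1) for c in classes], axis=1)
    return classes[np.argmax(scores, axis=1)], W

# toy usage: two well-separated Gaussian clusters
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(-2.0, 1.0, (20, 2)),
                     rng.normal(+2.0, 1.0, (20, 2))])
y_train = np.array([0] * 20 + [1] * 20)
X_test = np.array([[-2.0, -2.0], [2.0, 2.0]])
pred, W = one_step_knn(X_train, y_train, X_test)
print(pred)  # each test point labeled by its nearby cluster: [0 1]
```

Note that no K is ever set and no neighbor search is performed: the number of training samples with nonzero weight is determined implicitly by the group-lasso penalty `lam`, which is the point of the one-step formulation.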