KNN classification is a query-triggered, improvisational learning mode: computation is carried out only when a test sample is to be predicted, at which point a suitable K value must be set and the K nearest neighbors searched over the whole training sample space. This deferred computation is referred to as the lazy part of KNN classification, and it has been the bottleneck in applying KNN classification. In this paper, a one-step computation is proposed to replace the lazy part of KNN classification. The one-step computation transforms the lazy part into a matrix computation as follows. Given a test sample, the training samples are first fitted to it with a least squares loss function. A relationship matrix is then generated by weighting all training samples according to their influence on the test sample. Finally, a group lasso is employed to perform sparse learning of the relationship matrix. In this way, setting the K value and searching the K nearest neighbors are integrated into a single unified computation. In addition, a new classification rule is proposed to improve the performance of one-step KNN classification. The proposed approach is experimentally evaluated, and the results demonstrate that one-step KNN classification is efficient and promising.
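The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a plain lasso (solved by ISTA soft-thresholding) stands in for the group lasso, the function and parameter names (`one_step_knn`, `lam`, `n_iter`) are hypothetical, and the classification rule shown (summing the sparse weights per class) is one simple possibility rather than the rule proposed in the paper.

```python
import numpy as np

def one_step_knn(X_train, y_train, x_test, lam=0.1, n_iter=500):
    """Sketch of one-step KNN classification.

    Represent the test sample as a sparse linear combination of the
    training samples: least squares fit plus an L1 penalty (plain
    lasso as a stand-in for the paper's group lasso), solved by ISTA.
    The nonzero weights play the role of the K nearest neighbors, so
    no K needs to be set and no neighbor search is performed.
    X_train: (n, d) training samples; y_train: (n,); x_test: (d,).
    """
    A = X_train.T                      # (d, n): columns are training samples
    w = np.zeros(X_train.shape[0])     # one weight per training sample
    L = np.linalg.norm(A, 2) ** 2 + 1e-12   # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ w - x_test)        # gradient of 0.5*||A w - x||^2
        z = w - grad / L
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    # Simple illustrative rule: pick the class with the largest summed weight.
    classes = np.unique(y_train)
    scores = [w[y_train == c].sum() for c in classes]
    return classes[int(np.argmax(scores))], w

# Usage on two well-separated synthetic classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)),    # class 0 around the origin
               rng.normal(4.0, 1.0, (20, 2))])   # class 1 around (4, 4)
y = np.array([0] * 20 + [1] * 20)
label, w = one_step_knn(X, y, np.array([4.0, 4.0]))
```

The sparse weight vector `w` replaces both the K value (its support size) and the neighbor search (its nonzero positions) with a single optimization.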