We show that a simple modification of the 1-nearest neighbor classifier yields a strongly Bayes consistent learner. Prior to this work, the only strongly Bayes consistent proximity-based method was the k-nearest neighbor classifier, for k growing appropriately with sample size. We will argue that a margin-regularized 1-NN enjoys considerable statistical and algorithmic advantages over the k-NN classifier. These include user-friendly finite-sample error bounds, as well as time- and memory-efficient learning and test-point evaluation algorithms with a principled speed-accuracy tradeoff. Encouraging empirical results are reported.
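To make the margin-regularized 1-NN concrete, here is a minimal sketch under one plausible reading of the approach: condense the training sample to a γ-net (prototypes pairwise separated by a margin parameter γ) and classify new points by a single nearest neighbor among the prototypes alone. The greedy net construction, the helper names, and the choice of Euclidean distance are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def build_gamma_net(X, y, gamma):
    """Greedily select a gamma-net of the sample: every retained prototype
    is more than gamma away from all previously retained ones, so every
    training point lies within gamma of some prototype.
    (Illustrative construction; the paper's method may differ.)"""
    prototypes = []  # indices of retained net points
    for i in range(len(X)):
        if all(np.linalg.norm(X[i] - X[j]) > gamma for j in prototypes):
            prototypes.append(i)
    return X[prototypes], y[prototypes]

def predict_1nn(X_net, y_net, x):
    """Classify x by its single nearest neighbor among the net points."""
    dists = np.linalg.norm(X_net - x, axis=1)
    return y_net[np.argmin(dists)]

# Toy usage: two Gaussian blobs in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
X_net, y_net = build_gamma_net(X, y, gamma=1.0)
print(len(X_net), predict_1nn(X_net, y_net, np.array([4.0, 4.0])))
```

In this reading, the margin scale γ governs the speed-accuracy tradeoff the abstract alludes to: a larger γ yields a smaller net (faster, more memory-efficient evaluation), while a smaller γ preserves finer detail of the sample.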