When data are extraordinarily large or physically stored at different locations, the distributed nearest neighbor (NN) classifier is an attractive tool for classification. We propose a novel distributed adaptive NN classifier in which the number of nearest neighbors is a tuning parameter chosen stochastically by a data-driven criterion. We also propose an early stopping rule for the search over this tuning parameter, which not only speeds up the computation but also improves the finite-sample performance of the proposed algorithm. The convergence rate of the excess risk of the distributed adaptive NN classifier is investigated under various sub-sample size compositions. In particular, we show that when the sub-sample sizes are sufficiently large, the proposed classifier achieves the nearly optimal convergence rate. The effectiveness of the proposed approach is demonstrated through simulation studies as well as an empirical application to a real-world dataset.
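To make the high-level description above concrete, here is a minimal Python sketch of the general idea, not the authors' algorithm. The leave-one-out error criterion, the patience-based early stopping rule, the majority-vote aggregation across machines, and all names (`adaptive_k`, `distributed_nn_predict`, `patience`) are illustrative assumptions; the paper's actual data-driven criterion, stopping rule, and aggregation scheme may differ.

```python
# Illustrative sketch of a distributed adaptive NN classifier (assumptions
# noted above; this is not the paper's implementation).
import numpy as np

def adaptive_k(X, y, k_grid, patience=3):
    """Pick k on one sub-sample via leave-one-out (LOO) error, stopping the
    grid search early once the error has not improved for `patience`
    consecutive candidates (assumed criterion, for illustration only)."""
    # Pairwise distances within the sub-sample (O(n^2); fine for a sketch).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude the point itself (LOO)
    order = np.argsort(d, axis=1)        # neighbors sorted by distance
    best_k, best_err, stale = k_grid[0], np.inf, 0
    for k in k_grid:
        votes = y[order[:, :k]]          # labels of the k nearest neighbors
        pred = (votes.mean(axis=1) > 0.5).astype(int)
        err = np.mean(pred != y)
        if err < best_err:
            best_k, best_err, stale = k, err, 0
        else:
            stale += 1
            if stale >= patience:        # early stopping of the search
                break
    return best_k

def distributed_nn_predict(subsamples, x, k_grid):
    """Each machine classifies x with its own locally tuned k; the final
    label is a majority vote over the local decisions (assumed scheme)."""
    local = []
    for X, y in subsamples:
        k = adaptive_k(X, y, k_grid)
        d = np.linalg.norm(X - x, axis=1)
        nn = np.argsort(d)[:k]
        local.append(int(y[nn].mean() > 0.5))
    return int(np.mean(local) > 0.5)

# Usage: three machines, each holding a sub-sample of a toy 2-class problem.
rng = np.random.default_rng(0)
subsamples = []
for _ in range(3):
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    subsamples.append((X, y))
print(distributed_nn_predict(subsamples, np.array([0.5, 0.5]), range(1, 51)))
```

Tuning k independently on each sub-sample is what makes the classifier adaptive to the local sub-sample size, and stopping the grid search early avoids scanning the full range of candidate k values on every machine.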