The property of almost every point being a Lebesgue point has proven to be crucial for the consistency of several classification algorithms based on nearest neighbors. We characterize Lebesgue points in terms of a 1-Nearest Neighbor regression algorithm for pointwise estimation, fleshing out the role played by tie-breaking rules in the corresponding convergence problem. We then give an application of our results, proving the convergence of the risk of a large class of 1-Nearest Neighbor classification algorithms in general metric spaces where almost every point is a Lebesgue point.
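For orientation, here is a minimal sketch of the two objects the abstract refers to, written in notation of our own choosing (the paper may use different symbols). Given an i.i.d. sample $(X_1, Y_1), \dots, (X_n, Y_n)$ in a metric space $(\mathcal{X}, d)$ with $X_i \sim \mu$, the 1-Nearest Neighbor regression estimate of $\eta(x) = \mathbb{E}[Y \mid X = x]$ at a query point $x$ is
\[
\hat{\eta}_n(x) = Y_{(1)}(x), \qquad \text{where } X_{(1)}(x) \in \operatorname*{arg\,min}_{1 \le i \le n} d(x, X_i),
\]
with ties in the $\arg\min$ resolved by a prescribed tie-breaking rule. A point $x$ is a Lebesgue point of $\eta$ (with respect to $\mu$) when
\[
\lim_{r \downarrow 0} \frac{1}{\mu\big(B(x,r)\big)} \int_{B(x,r)} \big|\eta(z) - \eta(x)\big| \, \mu(dz) = 0,
\]
where $B(x,r)$ is the closed ball of radius $r$ around $x$. The characterization announced above relates this averaging property to the pointwise convergence $\hat{\eta}_n(x) \to \eta(x)$.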