We extend a recently proposed 1-nearest-neighbor based multiclass learning algorithm and prove that our modification is universally strongly Bayes-consistent in all metric spaces admitting any such learner, making it an "optimistically universal" Bayes-consistent learner. This is the first learning algorithm known to enjoy this property; by comparison, the $k$-NN classifier and its variants are not generally universally Bayes-consistent, except under additional structural assumptions, such as an inner product, a norm, finite dimension, or a Besicovitch-type property. The metric spaces in which universal Bayes consistency is possible are the "essentially separable" ones -- a notion that we define, which is more general than standard separability. The existence of metric spaces that are not essentially separable is widely believed to be independent of the ZFC axioms of set theory. We prove that essential separability exactly characterizes the existence of a universal Bayes-consistent learner for the given metric space. In particular, this yields the first impossibility result for universal Bayes consistency. Taken together, our results completely characterize strong and weak universal Bayes consistency in metric spaces.
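To ground the starting point, here is a minimal sketch of plain 1-nearest-neighbor prediction in a metric space. Euclidean distance on tuples stands in for the abstract metric, and the `nn_predict` helper is purely illustrative; the paper's modified, universally consistent learner is not reproduced here.

```python
import math

def nn_predict(train, query):
    """Plain 1-NN prediction: return the label of the training point
    closest to `query`.

    `train` is a list of (point, label) pairs; points are tuples of
    floats. Euclidean distance is a stand-in for a general metric.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # Scan all training points and keep the label of the nearest one.
    _, label = min(train, key=lambda pair: dist(pair[0], query))
    return label
```

In the general metric setting of the abstract, `dist` would be replaced by the metric of the given space; the consistency question is precisely whether rules built on such nearest-neighbor lookups converge to the Bayes-optimal error.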