We introduce a new low-noise condition for classification, the Model Margin Noise (MM noise) assumption, and derive enhanced $\mathcal{H}$-consistency bounds under this condition. MM noise is weaker than the Tsybakov noise condition: it is implied by the Tsybakov condition but can hold even when that condition fails, because it depends on the discrepancy between a given hypothesis and the Bayes classifier rather than on the distribution's intrinsic minimal margin (see Figure 1 for an explicit illustrative example). This hypothesis-dependent assumption yields enhanced $\mathcal{H}$-consistency bounds for both binary and multi-class classification. Our results extend the enhanced $\mathcal{H}$-consistency bounds of Mao, Mohri, and Zhong (2025a), achieving the same favorable exponents under an assumption weaker than the Tsybakov noise condition; the bounds interpolate smoothly between the linear and square-root regimes at intermediate noise levels. We also instantiate these bounds for common families of surrogate losses and provide illustrative tables.
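To make the interpolation claim concrete, the display below sketches the generic shape such noise-dependent $\mathcal{H}$-consistency bounds typically take; the notation ($\mathcal{E}_\ell$, $c$, the exponent parameter $\beta$) is assumed here for illustration and is not the paper's stated theorem.

% Illustrative shape only (assumed notation; the paper's exact constants and exponents may differ).
% \mathcal{E}_{\ell}(h) denotes the expected \ell-loss of h and \mathcal{E}^*_{\ell}(\mathcal{H}) the best-in-class \ell-loss.
\[
  \mathcal{E}_{\ell_{0\text{-}1}}(h) - \mathcal{E}^*_{\ell_{0\text{-}1}}(\mathcal{H})
  \;\le\;
  c \,\bigl(\mathcal{E}_{\ell}(h) - \mathcal{E}^*_{\ell}(\mathcal{H})\bigr)^{\frac{1}{2-\beta}},
  \qquad \beta \in [0,1].
\]
% \beta = 0 recovers the square-root regime, \beta = 1 the linear regime,
% and intermediate values of \beta interpolate smoothly between the two.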