We propose a novel method for selective classification (SC), a setting in which a classifier may abstain from predicting on some instances, thus trading off accuracy against coverage (the fraction of instances on which a prediction is made). In contrast to prior gating- or confidence-set-based work, our proposed method optimises a collection of class-wise decoupled one-sided empirical risks, and is in essence a method for explicitly finding, for each class, the largest decision set with few false positives. This one-sided prediction (OSP) based relaxation yields an SC scheme that attains near-optimal coverage in the practically relevant high-target-accuracy regime, and further admits an efficient implementation, leading to a flexible and principled method for SC. We derive generalization bounds for SC and OSP, and empirically show that our scheme strongly outperforms state-of-the-art methods in coverage at small error levels.
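To make the OSP idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of one-sided per-class decision sets built by thresholding classifier scores: for each class, we pick the most permissive threshold whose false-positive rate on held-out data stays below a target, and abstain whenever no class accepts the instance. The function names and the simple threshold search are illustrative assumptions.

```python
import numpy as np

def fit_class_thresholds(scores, labels, max_fpr=0.05):
    """Illustrative OSP-style calibration: for each class k, find the
    lowest score threshold whose held-out false-positive rate (fraction
    of non-class-k instances accepted) stays below max_fpr. Lower
    thresholds give larger decision sets, hence higher coverage."""
    n, n_classes = scores.shape
    thresholds = np.ones(n_classes) * np.inf  # default: never accept
    for k in range(n_classes):
        s_k = scores[:, k]
        neg = s_k[labels != k]  # scores of instances NOT in class k
        # Scan candidate thresholds from most to least conservative;
        # FPR is monotone non-decreasing as the threshold drops.
        for t in np.sort(s_k)[::-1]:
            fpr = np.mean(neg >= t) if len(neg) else 0.0
            if fpr <= max_fpr:
                thresholds[k] = t  # enlarge the decision set
            else:
                break
    return thresholds

def predict_selective(scores, thresholds):
    """Predict the highest-scoring accepting class; abstain (-1)
    if no class-wise one-sided predictor accepts the instance."""
    accepted = scores >= thresholds
    preds = np.full(scores.shape[0], -1)
    for i in range(scores.shape[0]):
        idx = np.where(accepted[i])[0]
        if len(idx):
            preds[i] = idx[np.argmax(scores[i, idx])]
    return preds
```

Tightening `max_fpr` shrinks each class's decision set, lowering coverage in exchange for fewer false positives, which mirrors the accuracy-coverage trade-off described above.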