Conformal Predictors (CP) are wrappers around ML methods, providing error guarantees under weak assumptions on the data distribution. They are suitable for a wide range of problems, from classification and regression to anomaly detection. Unfortunately, their high computational complexity limits their applicability to large datasets. In this work, we show that it is possible to speed up a CP classifier considerably by studying it in conjunction with the underlying ML method, and by exploiting incremental and decremental learning. For methods such as k-NN, KDE, and kernel LS-SVM, our approach reduces the running time by one order of magnitude, whilst producing exact solutions. With similar ideas, we also achieve a linear speedup for the harder case of bootstrapping. Finally, we extend these techniques to improve upon an optimization of k-NN CP for regression. We evaluate our findings empirically, and discuss when methods are suitable for CP optimization.
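To make the computational bottleneck concrete, the sketch below shows a naive full (transductive) conformal predictor with a k-NN nonconformity score: for every candidate label, all scores are recomputed from scratch, which is the per-label retraining cost that incremental and decremental learning avoids. This is an illustrative example under common CP conventions, not the paper's implementation; all names (knn_nonconformity, cp_pvalues) are hypothetical.

```python
import numpy as np

def knn_nonconformity(X, y, i, k=3):
    """Nonconformity of example i: summed distance to its k nearest
    same-label neighbours (a common k-NN score)."""
    same = [j for j in range(len(y)) if j != i and y[j] == y[i]]
    d = np.sort(np.linalg.norm(X[same] - X[i], axis=1))
    return d[:k].sum()

def cp_pvalues(X_train, y_train, x_new, labels, k=3):
    """For each candidate label, append (x_new, label) and recompute all
    nonconformity scores from scratch -- the naive retraining step that
    the paper's incremental/decremental approach optimizes away."""
    pvals = {}
    for lab in labels:
        X = np.vstack([X_train, x_new])
        y = np.append(y_train, lab)
        scores = np.array([knn_nonconformity(X, y, i, k)
                           for i in range(len(y))])
        # p-value: fraction of scores at least as large as the new one
        pvals[lab] = np.mean(scores >= scores[-1])
    return pvals

# Usage: the prediction set at significance eps contains every label
# whose p-value exceeds eps; validity holds under exchangeability.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, 2))
y_train = rng.integers(0, 2, size=40)
pv = cp_pvalues(X_train, y_train, rng.normal(size=2), labels=[0, 1])
eps = 0.1
print(pv, [lab for lab, p in pv.items() if p > eps])
```

Note how each test point costs a full pass over the augmented training set per candidate label; exploiting incremental (adding the test point) and decremental (removing it) updates of the underlying k-NN structure is what yields the order-of-magnitude speedup reported above.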