Conformal Predictors (CP) are wrappers around ML models that provide error guarantees under weak assumptions on the data distribution. They are suitable for a wide range of problems, from classification and regression to anomaly detection. Unfortunately, their very high computational complexity limits their applicability to large datasets. In this work, we show that a CP classifier can be sped up considerably by studying it in conjunction with the underlying ML method, and by exploiting incremental and decremental learning. For methods such as k-NN, KDE, and kernel LS-SVM, our approach reduces the running time by one order of magnitude whilst producing exact solutions. With similar ideas, we also achieve a linear speedup for the harder case of bootstrapping. Finally, we extend these techniques to improve upon an existing optimization of k-NN CP for regression. We evaluate our findings empirically, and discuss when an underlying ML method is amenable to CP optimization.
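To make the computational bottleneck concrete, the sketch below implements full (transductive) conformal prediction for classification with a simple k-NN nonconformity score. The names `knn_nonconformity` and `conformal_pvalues`, and the particular score (mean distance to the k nearest same-label neighbours), are illustrative assumptions rather than the paper's implementation; the point is that every candidate label triggers a recomputation of all n+1 scores on the augmented training set, which is exactly the cost that incremental and decremental learning removes.

```python
import numpy as np

def knn_nonconformity(X, y, i, k=3):
    """Nonconformity of example i: mean distance to its k nearest
    neighbours sharing its label (one common k-NN score; assumes at
    least k other same-label examples exist)."""
    same = np.where((y == y[i]) & (np.arange(len(y)) != i))[0]
    d = np.linalg.norm(X[same] - X[i], axis=1)
    return np.sort(d)[:k].mean()

def conformal_pvalues(X_train, y_train, x_new, labels, k=3):
    """Full CP: for each candidate label, augment the data with
    (x_new, label), recompute *all* nonconformity scores from scratch,
    and return the p-value #{i : alpha_i >= alpha_new} / (n + 1).
    This naive per-label recomputation is what incremental and
    decremental learning avoids."""
    pvals = {}
    for lab in labels:
        X = np.vstack([X_train, x_new])
        y = np.append(y_train, lab)
        alphas = np.array([knn_nonconformity(X, y, i, k)
                           for i in range(len(y))])
        # alphas[-1] is the new point's score; the comparison includes
        # the new point itself, giving the (n + 1) denominator.
        pvals[lab] = np.mean(alphas >= alphas[-1])
    return pvals
```

With L candidate labels, this naive loop recomputes every score for every label, so the per-test-point cost grows roughly as L times the cost of scoring the full augmented set; maintaining the neighbour structure incrementally as (x_new, lab) is added and removed is the kind of reuse that yields the order-of-magnitude savings claimed above.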