We study the problem of uncertainty quantification via prediction sets, in an online setting where the data distribution may vary arbitrarily over time. Recent work develops online conformal prediction techniques that leverage regret minimization algorithms from the online learning literature to learn prediction sets with approximately valid coverage and small regret. However, standard regret minimization could be insufficient for handling changing environments, where performance guarantees may be desired not only over the full time horizon but also in all (sub-)intervals of time. We develop new online conformal prediction methods that minimize the strongly adaptive regret, which measures the worst-case regret over all intervals of a fixed length. We prove that our methods achieve near-optimal strongly adaptive regret for all interval lengths simultaneously, and approximately valid coverage. Experiments show that our methods consistently obtain better coverage and smaller prediction sets than existing methods on real-world tasks, such as time series forecasting and image classification under distribution shift.
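To make the setting concrete, the following is a minimal sketch of online conformal prediction via quantile tracking (in the spirit of adaptive conformal inference), not the strongly adaptive method proposed here: a score threshold `q` is updated each round by a pinball-loss gradient step, so the prediction set `{y : score(y) <= q}` achieves approximately `1 - alpha` coverage over time. The function name and learning rate are illustrative choices, not from the paper.

```python
import numpy as np

def online_quantile_tracking(scores, alpha=0.1, lr=0.05):
    """Track a conformal score threshold online.

    A simplified sketch, not the paper's strongly adaptive algorithm:
    after each round, the threshold q rises by lr*(1 - alpha) on a
    miscoverage event and falls by lr*alpha on a coverage event, i.e.
    an online gradient step on the pinball loss at level 1 - alpha.
    """
    q = 0.0
    covered_history = []
    for s in scores:
        covered = s <= q  # the set {y : score(y) <= q} covers iff s <= q
        covered_history.append(covered)
        # pinball-loss gradient step: q_{t+1} = q_t + lr * (1{miss} - alpha)
        q += lr * ((0.0 if covered else 1.0) - alpha)
    return q, float(np.mean(covered_history))

# Example: scores drawn i.i.d. uniform on [0, 1]; the threshold should
# hover near the 0.9 quantile, giving roughly 90% empirical coverage.
rng = np.random.default_rng(0)
q_final, coverage = online_quantile_tracking(rng.random(5000), alpha=0.1, lr=0.05)
```

Under a fixed distribution this simple update already attains approximate coverage; the motivation for strongly adaptive regret is that a single global learning rate like the one above can react slowly when the score distribution shifts, which is what guarantees over all sub-intervals address.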