We introduce a general framework of stochastic online convex optimization to obtain fast-rate stochastic regret bounds. We prove that algorithms such as Online Newton Step and a scale-free version of Bernstein Online Aggregation achieve best-known rates in unbounded stochastic settings. We apply our approach to calibrate parametric probabilistic forecasters of non-stationary sub-Gaussian time series. Our fast-rate stochastic regret bounds are anytime valid. Our proofs combine self-bounded and Poissonian inequalities for martingales and sub-Gaussian random variables, respectively, under a stochastic exp-concavity assumption.
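For readers unfamiliar with the first algorithm named above, the following is a minimal sketch of the standard Online Newton Step update in its usual unprojected form. The gradient oracle `loss_grad`, the parameter names, and the omission of the generalized projection step are illustrative choices for this sketch, not details taken from the paper.

```python
import numpy as np

def online_newton_step(loss_grad, theta0, gamma, eps, T):
    """Minimal Online Newton Step sketch (in the style of Hazan et al., 2007).

    loss_grad(t, theta): returns the gradient of the round-t loss at theta
                         (hypothetical oracle supplied by the caller).
    gamma: exp-concavity parameter of the losses.
    eps:   regularization constant initializing A_0 = eps * I.
    The projection onto the feasible set is omitted for brevity.
    """
    d = theta0.shape[0]
    A = eps * np.eye(d)            # running second-order matrix A_t
    theta = theta0.copy()
    for t in range(T):
        g = loss_grad(t, theta)    # gradient of the current loss
        A += np.outer(g, g)        # rank-one update: A_t = A_{t-1} + g g^T
        theta -= np.linalg.solve(A, g) / gamma  # Newton-like step
    return theta
```

Under exp-concave losses, this update is the classical route to logarithmic (fast-rate) regret, which is the deterministic counterpart of the stochastic regret bounds studied here.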