We consider the problem of minimizing a convex function that is evolving in time according to unknown and possibly stochastic dynamics. Such problems abound in the machine learning and signal processing literature, under the names of concept drift and stochastic tracking. We provide novel non-asymptotic convergence guarantees for stochastic algorithms with iterate averaging, focusing on bounds valid both in expectation and with high probability. Notably, we show that, when equipped with a step-decay schedule, the proximal stochastic gradient method achieves a tracking efficiency that depends only logarithmically on the initialization quality. Moreover, the results extend naturally to settings where the dynamics depend jointly on time and on the decision variable itself, as in the performative prediction framework.
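As a minimal illustration of the method named above (a sketch, not the paper's exact setting or analysis), the snippet below runs the proximal stochastic gradient method with a step-decay schedule and within-epoch iterate averaging on a toy time-varying objective f_t(x) = ½‖x − m_t‖² + λ‖x‖₁, whose smooth-part minimizer m_t drifts by a random walk. All constants, the drift model, and helper names such as soft_threshold are hypothetical choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 10           # dimension (hypothetical)
lam = 0.1        # l1 regularization weight (hypothetical)
T = 2000         # horizon
epoch_len = 200  # step-decay epoch length (hypothetical)
eta = 0.5        # initial step size (hypothetical)
sigma = 0.1      # gradient noise level
drift = 0.01     # per-step drift of the moving minimizer

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

# Time-varying objective: f_t(x) = 0.5*||x - m_t||^2 + lam*||x||_1,
# where m_t follows a random walk unknown to the algorithm.
m = rng.standard_normal(d)
x = np.zeros(d)      # (possibly poor) initialization
x_avg = np.zeros(d)  # within-epoch running average of iterates

for t in range(T):
    m = m + drift * rng.standard_normal(d)        # unknown dynamics
    g = (x - m) + sigma * rng.standard_normal(d)  # stochastic gradient of the smooth part
    x = soft_threshold(x - eta * g, eta * lam)    # proximal stochastic gradient step
    k = t % epoch_len
    x_avg = (k * x_avg + x) / (k + 1)             # iterate averaging (resets each epoch)
    if k == epoch_len - 1:
        eta *= 0.5                                # step-decay: halve the step each epoch
        print(f"epoch {t // epoch_len}: tracking error {np.linalg.norm(x_avg - m):.3f}")
```

In this sketch the step size is held constant within each epoch and halved across epochs, which is one common instantiation of a step-decay schedule; the averaged iterate x_avg, rather than the last iterate, serves as the tracking estimate.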