Distributed optimization methods are actively researched by the optimization community. Driven by applications in distributed machine learning, modern research directions include stochastic objectives, reduced communication frequency, and time-varying communication network topologies. Recently, Koloskova et al. (2020) developed an analysis unifying several centralized and decentralized approaches to stochastic distributed optimization. In this work, we employ the Catalyst framework to accelerate the rates of Koloskova et al. (2020) in the low-stochastic-noise regime.
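For intuition, the following is a minimal sketch of the generic Catalyst outer loop (Lin et al., 2015) that this kind of acceleration builds on: an inner method repeatedly solves a regularized subproblem around an extrapolated prox center. The names `inner_solver`, `kappa`, and `mu` are hypothetical placeholders, not part of this paper; in our setting the inner solver would be one of the decentralized stochastic methods covered by the analysis of Koloskova et al. (2020).

```python
import numpy as np

def catalyst(inner_solver, x0, kappa, mu, n_outer):
    """Sketch of the Catalyst outer loop around a generic inner solver.

    inner_solver(prox_center, kappa, x_init) is assumed to return an
    approximate minimizer of  f(x) + (kappa / 2) * ||x - prox_center||^2.
    `mu` is the strong-convexity parameter of f.
    """
    q = mu / (mu + kappa)      # condition ratio of the regularized subproblem
    alpha = np.sqrt(q)         # standard initialization in the strongly convex case
    x_prev = x0
    y = x0                     # prox center, updated with Nesterov-style momentum
    for _ in range(n_outer):
        # Inner loop: approximately solve the kappa-regularized subproblem.
        x = inner_solver(prox_center=y, kappa=kappa, x_init=x_prev)
        # Solve alpha_{k+1}^2 = (1 - alpha_{k+1}) * alpha_k^2 + q * alpha_{k+1}
        # for the next extrapolation coefficient alpha_{k+1}.
        a2 = alpha ** 2
        alpha_next = (q - a2 + np.sqrt((q - a2) ** 2 + 4.0 * a2)) / 2.0
        beta = alpha * (1.0 - alpha) / (alpha ** 2 + alpha_next)
        # Outer extrapolation (momentum) step on the prox center.
        y = x + beta * (x - x_prev)
        x_prev, alpha = x, alpha_next
    return x_prev
```

The design intuition is that the added proximal term makes each subproblem well conditioned, so a slow inner method solves it cheaply, while the outer extrapolation recovers an accelerated rate overall; this is what allows improving the rates of the unified analysis when the stochastic noise is low.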