We propose ADOM, an accelerated method for smooth and strongly convex decentralized optimization over time-varying networks. ADOM uses a dual oracle, i.e., we assume access to the gradient of the Fenchel conjugate of the individual loss functions. Up to a constant factor, which depends on the network structure only, its communication complexity is the same as that of Nesterov's accelerated gradient method (Nesterov, 2003). To the best of our knowledge, only the algorithm of Rogozin et al. (2019) has a convergence rate with similar properties. However, their algorithm converges under the very restrictive assumption that the number of network changes cannot exceed a tiny percentage of the number of iterations. This assumption is hard to satisfy in practice, as changes in the network topology usually cannot be controlled. In contrast, ADOM merely requires the network to stay connected throughout time.
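For readers less familiar with the dual oracle, the following is a standard convex-analysis reminder (the notation $f_i$ for the individual losses is used here only for illustration and is not taken from the abstract):
$$
f_i^*(y) \;=\; \sup_{x}\,\bigl\{\langle y, x\rangle - f_i(x)\bigr\},
\qquad
\nabla f_i^*(y) \;=\; \arg\max_{x}\,\bigl\{\langle y, x\rangle - f_i(x)\bigr\} \;=\; (\nabla f_i)^{-1}(y),
$$
where the last equality holds because each $f_i$ is assumed smooth and strongly convex, so the maximizer is unique and characterized by $\nabla f_i(x) = y$. Evaluating this gradient is exactly what the dual oracle provides.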