This work is devoted to solving the composite optimization problem with a mixed oracle: for the smooth part of the problem we have access to the gradient, while for the non-smooth part we have access only to a one-point zero-order oracle. We present a method based on the sliding algorithm. Our method allows us to separate the oracle complexities and to compute the gradient of one of the functions as rarely as possible. The paper also examines the applicability of this method to problems of distributed optimization and federated learning.
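For concreteness, the setting can be sketched as follows (the symbols $f$, $g$, $d$, $\tau$, and $e$ are our notation and a standard form of the estimator, not taken from the abstract; the paper's exact construction may differ). The composite problem is
\[
\min_{x \in \mathbb{R}^d} \; F(x) := f(x) + g(x),
\]
where the smooth part $f$ is accessed through its gradient $\nabla f(x)$, while the non-smooth part $g$ is accessed only through single (possibly noisy) function values. A standard one-point zero-order gradient estimator built from one such value is
\[
\widetilde{\nabla} g(x) = \frac{d}{\tau}\, g(x + \tau e)\, e, \qquad e \sim \mathrm{Uniform}(S^{d-1}), \quad \tau > 0,
\]
and a sliding-type scheme can alternate many cheap zero-order steps of this kind with infrequent calls to $\nabla f$, which is, roughly, how the oracle complexities can be separated.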