This work is devoted to solving a composite optimization problem with a mixture oracle: for the smooth part of the problem we have access to the gradient, while for the non-smooth part only a one-point zero-order oracle is available. For this setup, we present a new method based on the sliding algorithm. Our method allows us to separate the oracle complexities and to compute the gradient of one of the functions as rarely as possible. The paper also presents the applicability of the new method to problems of distributed optimization and federated learning. Experimental results confirm the theory.
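For concreteness, the mixture-oracle setting can be sketched as follows; the composite formulation and the one-point estimator below are standard assumptions for this class of methods, not details stated in the abstract itself:
\[
\min_{x \in \mathbb{R}^d} \; F(x) := f(x) + g(x),
\]
where the gradient $\nabla f(x)$ of the smooth part is available, while $g$ can only be queried for a single (possibly noisy) function value per call. A typical one-point gradient surrogate built from such an oracle is
\[
\widetilde{\nabla} g(x) = \frac{d}{\tau}\, g(x + \tau e)\, e, \qquad e \sim \mathrm{Uniform}\!\left(S^{d-1}\right),
\]
with smoothing radius $\tau > 0$. Sliding-type schemes typically call the expensive oracle ($\nabla f$ here) only in an outer loop while running many cheap inner iterations with the zero-order estimator, which is how the two oracle complexities are decoupled.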