In this paper, we investigate unconstrained and constrained sample-based federated optimization. For each problem, we propose a privacy-preserving algorithm based on stochastic successive convex approximation (SSCA) techniques, and show that it converges to a Karush-Kuhn-Tucker (KKT) point. To the best of our knowledge, SSCA has not previously been applied to federated optimization, and federated optimization with nonconvex constraints has not been investigated. Next, we customize the two proposed SSCA-based algorithms to two application examples, and provide closed-form solutions for the respective approximate convex problems at each iteration of SSCA. Finally, numerical experiments demonstrate the inherent advantages of the proposed algorithms in terms of convergence speed, communication cost, and model specification.
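To make the SSCA iteration concrete, below is a minimal single-machine sketch of the generic (non-federated) unconstrained SSCA update — not the paper's algorithm. It assumes the common surrogate choice of a quadratic convex approximation built from a recursively averaged stochastic gradient; the step-size exponents, the regularization weight `tau`, and the toy objective are illustrative assumptions.

```python
import random

def ssca_minimize(grad_sample, x0, iters=2000, tau=1.0, seed=0):
    """Sketch of unconstrained SSCA for min_x E[f(x; xi)].

    At iteration t, a convex quadratic surrogate
        <g_t, x> + (tau/2) * (x - x_t)^2
    is minimized in closed form, where g_t is a recursively averaged
    stochastic gradient; the iterate is then updated by averaging.
    """
    rng = random.Random(seed)
    x, g = x0, 0.0
    for t in range(1, iters + 1):
        rho = 1.0 / t ** 0.6          # surrogate-averaging step size
        gamma = 1.0 / t ** 0.9        # iterate-averaging step size
        g = (1 - rho) * g + rho * grad_sample(x, rng)
        x_hat = x - g / tau           # closed-form surrogate minimizer
        x = (1 - gamma) * x + gamma * x_hat
    return x

# Toy problem: f(x; xi) = 0.5 * (x - xi)^2 with xi ~ N(1, 0.1),
# so the expected objective is minimized at x* = 1.
x_final = ssca_minimize(lambda x, rng: x - rng.gauss(1.0, 0.1), x0=5.0)
```

In a federated setting the stochastic gradient would instead be assembled from client updates, but the surrogate-build-then-average structure is the same.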