As an edge intelligence algorithm for multi-device collaborative training, federated learning (FL) can reduce the communication burden but increases the computing load of wireless devices. In contrast, split learning (SL) can reduce the computing load of devices through model splitting and assignment, but increases the communication burden of transmitting intermediate results. In this paper, to exploit the advantages of both FL and SL, we propose a hybrid federated split learning (HFSL) framework for wireless networks, which combines the multi-worker parallel update of FL with the flexible model splitting of SL. To reduce computational idleness during model splitting, we design a parallel computing scheme for model splitting without label sharing, and theoretically analyze the influence of the delayed gradient caused by the scheme on the convergence speed. To obtain the trade-off between training time and energy consumption, we jointly optimize the splitting decision and the allocation of bandwidth and computing resources. Since the optimization problem is multi-objective, we propose a predictive generative adversarial network (GAN)-powered multi-objective optimization algorithm to obtain its Pareto front. Experimental results show that the proposed algorithm outperforms the baselines in finding Pareto optimal solutions, and that the solutions obtained by the proposed HFSL dominate those of FL.
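To make the Pareto-front terminology concrete, the sketch below shows the standard dominance relation for a bi-objective minimization over (training time, energy consumption). This is a generic illustration with hypothetical objective values, not the paper's algorithm or experimental data:

```python
def dominates(a, b):
    """Solution a dominates b (minimization) if a is no worse in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated (time, energy) pairs."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical (training time, energy consumption) candidate solutions
candidates = [(10.0, 5.0), (8.0, 6.0), (12.0, 4.0), (9.0, 7.0)]
print(pareto_front(candidates))  # (9.0, 7.0) is dominated by (8.0, 6.0)
```

A multi-objective optimizer such as the one proposed here returns this set of mutually non-dominated trade-off points rather than a single optimum.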