Wireless federated learning (FL) is an emerging machine learning paradigm that trains a global parametric model from distributed datasets via wireless communications. This paper proposes a unit-modulus wireless FL (UMWFL) framework, which simultaneously uploads local model parameters and computes global model parameters via optimized phase shifting. The proposed framework avoids sophisticated baseband signal processing, leading to both low communication delay and low implementation cost. A training loss bound is derived, and a penalty alternating minimization (PAM) algorithm is proposed to minimize the resulting nonconvex and nonsmooth bound. Experimental results on the Car Learning to Act (CARLA) platform show that the proposed UMWFL framework with the PAM algorithm achieves smaller training losses and testing errors than the benchmark scheme.
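As a rough illustration of the over-the-air aggregation idea behind UMWFL, the sketch below simulates devices that apply only unit-modulus phase shifts to their transmitted model parameters, so that the server recovers an aggregated model from the superposed received signal. This is a minimal sketch under assumed settings, not the paper's exact signal model or the PAM algorithm; the names (`local_models`, `channels`, `theta`) and the simple phase-compensation rule are illustrative only.

```python
import numpy as np

# Minimal sketch of unit-modulus over-the-air model aggregation
# (illustrative assumptions, not the paper's exact signal model).
rng = np.random.default_rng(0)
K, d = 4, 8                                   # number of devices, model dimension
local_models = rng.normal(size=(K, d))        # hypothetical local parameters w_k
channels = (rng.normal(size=K) + 1j * rng.normal(size=K)) / np.sqrt(2)  # h_k

# Unit-modulus transmit weights: each device applies only a phase shift
# exp(j*theta_k); here the phase simply compensates the channel phase so the
# superposition adds coherently.
theta = -np.angle(channels)
weights = np.exp(1j * theta)                  # |weights[k]| = 1 for every k

noise = 0.01 * (rng.normal(size=d) + 1j * rng.normal(size=d))
received = (channels * weights) @ local_models + noise   # superposed signal

# Server-side recovery: normalize by the effective aggregate channel gain.
effective_gain = np.sum(channels * weights).real
global_model = received.real / effective_gain

# With this simple rule the recovered model is a channel-weighted average of
# the local models, up to the additive noise.
target = (np.abs(channels) @ local_models) / np.abs(channels).sum()
print(np.max(np.abs(global_model - target)))  # small residual due to noise
```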