Federated learning (FL) has been recognized as a promising distributed learning paradigm to support intelligent applications at the wireless edge, where a global model is trained iteratively through the collaboration of edge devices without sharing their data. However, due to the relatively high communication cost between the devices and the parameter server (PS), direct aggregation of the information from all devices may not be resource-efficient. This paper studies the joint communication and learning design for an over-the-air computation (AirComp)-based two-tier wireless FL scheme, where the lead devices first collect the local gradients from their nearby subordinate devices and then send the merged results to the PS for a second round of aggregation. We establish a convergence result for the proposed scheme and derive an upper bound on the optimality gap between the expected and optimal global loss values. Next, based on device distance and data importance, we propose a hierarchical clustering method to build the two-tier structure. Then, with only the instantaneous channel state information (CSI), we formulate the optimality gap minimization problem and solve it with an efficient alternating minimization method. Numerical results show that the proposed scheme outperforms the baseline schemes.
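The two-tier aggregation described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the cluster layout, gradient dimensions, and variable names are assumptions, and the over-the-air (analog) summation is idealized here as exact averaging, ignoring channel fading and noise.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # model dimension (assumed for illustration)

# clusters[k] holds the local gradients of the subordinate devices in cluster k
clusters = [
    [rng.standard_normal(dim) for _ in range(3)],
    [rng.standard_normal(dim) for _ in range(5)],
]

# Tier 1: each lead device merges (averages) the gradients of its own cluster
merged = [np.mean(np.stack(c), axis=0) for c in clusters]

# Tier 2: the PS aggregates the merged results, weighted by cluster size,
# which recovers the plain average over all devices
sizes = np.array([len(c) for c in clusters], dtype=float)
global_grad = np.average(np.stack(merged), axis=0, weights=sizes)

# Sanity check: the two-tier result matches direct single-tier averaging
all_grads = np.stack([g for c in clusters for g in c])
assert np.allclose(global_grad, all_grads.mean(axis=0))
```

Under ideal (noise-free) aggregation, the size-weighted second tier makes the hierarchy equivalent to flat averaging; the paper's design concerns arise precisely because AirComp introduces channel distortion at both tiers.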