Communication efficiency is of great importance for wireless federated learning systems. In this paper, we propose a communication-efficient strategy for federated learning over multiple-input multiple-output (MIMO) multiple access channels (MACs). The proposed strategy comprises two components. When sending a locally computed gradient, each device compresses its high-dimensional local gradient into multiple lower-dimensional gradient vectors using block sparsification. When receiving a superposition of the compressed local gradients via the MIMO-MAC, a parameter server (PS) performs joint MIMO detection and sparse local-gradient recovery. Inspired by the turbo decoding principle, our joint detection-and-recovery algorithm accurately reconstructs the high-dimensional local gradients by iteratively exchanging beliefs between the MIMO detection and the sparse local-gradient recovery modules. We then analyze the reconstruction error of the proposed algorithm and its impact on the convergence rate of federated learning. Simulations show that our gradient compression and joint detection-and-recovery methods significantly reduce the communication cost while achieving classification accuracy identical to that of the case without any compression.
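To make the block-sparsification step concrete, the following minimal Python/NumPy sketch splits a flat local gradient into equal-sized blocks and keeps only the k largest-magnitude entries of each block. The helper name block_sparsify and the parameters num_blocks and k are illustrative assumptions; the paper's exact compression rule (e.g., how the sparsity budget is allocated across blocks) may differ.

```python
import numpy as np

def block_sparsify(grad, num_blocks, k):
    """Illustrative block sparsification (assumed top-k per block).

    Splits a flat gradient into `num_blocks` lower-dimensional blocks and
    zeroes all but the k largest-magnitude entries within each block.
    Returns the list of sparsified block vectors.
    """
    blocks = np.array_split(grad, num_blocks)
    sparse_blocks = []
    for b in blocks:
        out = np.zeros_like(b)
        if k > 0:
            idx = np.argsort(np.abs(b))[-k:]  # indices of the k largest magnitudes
            out[idx] = b[idx]
        sparse_blocks.append(out)
    return sparse_blocks

# Example: a 12-dimensional local gradient compressed into 3 blocks,
# with 2 nonzero entries retained per block.
grad = np.array([0.3, -1.2, 0.05, 0.9, 2.1, -0.1,
                 0.0, 0.4, -0.7, 1.5, 0.02, -0.3])
for block in block_sparsify(grad, num_blocks=3, k=2):
    print(block)
```

In this sketch, each device would transmit only the retained nonzero values (and their positions) of each block, which is what makes the resulting low-dimensional vectors amenable to sparse recovery at the PS.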