We explore a scheme that enables the training of a deep neural network in a Federated Learning configuration over an additive white Gaussian noise channel. The goal is to create a low-complexity linear compression strategy, called PolarAir, that reduces the size of the gradient at the user side to lower the number of channel uses needed to transmit it. The suggested approach belongs to the family of compressed sensing techniques, yet it constructs the sensing matrix and the recovery procedure using multiple access techniques. Simulations show that it can reduce the number of channel uses by approximately 30% compared to conveying the gradient without compression. The main advantage of the proposed scheme over others in the literature is its low time complexity. We also investigate the behavior of gradient updates and the performance of PolarAir throughout the training process to gain insight into how best to construct compression schemes based on compressed sensing.
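To illustrate the general compressed-sensing template the abstract refers to, the sketch below compresses a sparsified gradient with a linear measurement step and recovers it greedily. This is a generic stand-in, not the PolarAir construction: PolarAir builds its sensing matrix and recovery procedure from multiple-access techniques, whereas here a random Gaussian matrix and orthogonal matching pursuit (OMP) are assumed purely for illustration, and the dimensions `n`, `m`, `k` are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 96, 8  # gradient dimension, channel uses, sparsity (illustrative)

# Sparse gradient update: only k significant entries (e.g., after top-k sparsification)
g = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
g[support] = rng.standard_normal(k)

# Generic random sensing matrix; PolarAir instead derives this from
# multiple-access techniques. Transmitting y takes m < n channel uses.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ g

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares."""
    residual, chosen = y.copy(), []
    for _ in range(k):
        chosen.append(int(np.argmax(np.abs(A.T @ residual))))
        coeffs, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
        residual = y - A[:, chosen] @ coeffs
    g_hat = np.zeros(A.shape[1])
    g_hat[chosen] = coeffs
    return g_hat

g_hat = omp(A, y, k)
rel_err = float(np.linalg.norm(g - g_hat) / np.linalg.norm(g))
print(f"channel uses: {m} of {n}, relative recovery error: {rel_err:.2e}")
```

In the noiseless setting above, recovery is essentially exact; over an actual AWGN channel the measurements `y` would be perturbed by noise, which is where the choice of sensing matrix and decoder becomes critical.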