Federated learning enables remote workers to collaboratively train a shared machine learning model while keeping their training data local. For wireless mobile devices, communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has applied data compression tools such as quantization and sparsification to reduce this overhead. In this paper, we propose a predictive coding-based communication scheme for federated learning. The scheme shares prediction functions among all devices and lets each worker transmit a compressed residual vector derived from the reference. In each communication round, we select the predictor and quantizer based on the rate-distortion cost, and further reduce redundancy with entropy coding. Extensive simulations show that the communication cost can be reduced by up to 99%, with even better learning performance, compared with baseline methods.
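The core idea of residual transmission with rate-distortion-based quantizer selection can be illustrated with a minimal sketch. This is not the paper's implementation: the uniform scalar quantizer, the candidate bit widths, the Lagrange multiplier `lam`, and the toy predictor are all illustrative assumptions; entropy coding is omitted and the rate is counted as raw quantization bits.

```python
import numpy as np

def quantize(v, bits):
    """Uniform scalar quantization of vector v to 2**bits levels.

    Returns the dequantized vector and the (raw) rate in bits.
    """
    vmin, vmax = float(v.min()), float(v.max())
    scale = (vmax - vmin) / (2 ** bits - 1)
    if scale == 0.0:
        return v.copy(), bits * v.size
    q = np.round((v - vmin) / scale)
    return q * scale + vmin, bits * v.size

def encode_update(update, prediction, bit_options=(2, 4, 8), lam=1e-7):
    """Transmit only the residual between the local update and the shared
    prediction, picking the quantizer that minimizes D + lam * R."""
    residual = update - prediction
    best = None
    for bits in bit_options:
        deq, rate = quantize(residual, bits)
        distortion = float(np.mean((residual - deq) ** 2))
        cost = distortion + lam * rate
        if best is None or cost < best[0]:
            best = (cost, bits, deq, rate)
    _, bits, deq, rate = best
    # The server reconstructs the update as prediction + dequantized residual.
    return prediction + deq, bits, rate

rng = np.random.default_rng(0)
true_update = rng.normal(size=1000)
# Hypothetical shared predictor: close to the true update, so the
# residual has small dynamic range and compresses well.
prediction = true_update + 0.05 * rng.normal(size=1000)
reconstructed, chosen_bits, rate_bits = encode_update(true_update, prediction)
```

Because the predictor is accurate, the residual's dynamic range is small, so even a coarse quantizer reconstructs the update with low distortion; the rate-distortion cost trades that distortion against the bits spent per round.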