Federated learning is a promising framework for mitigating data privacy and computation concerns. However, the communication cost between the server and clients has become the major bottleneck to successful deployment. Despite notable progress in gradient compression, existing quantization methods still need improvement when low-bit compression is applied; in particular, the overall system often degrades substantially when quantization is applied in both directions to compress model weights and gradients. In this work, we propose a simple cosine-based nonlinear quantization and achieve impressive results in compressing round-trip communication costs. We not only compress model weights and gradients at higher ratios than previous methods, but also achieve competitive model performance at the same time. Furthermore, our approach is well suited to federated learning because it has low computational complexity and requires only a small amount of additional data to recover the compressed information. Extensive experiments on image classification and brain tumor semantic segmentation with the CIFAR-10 and BraTS datasets show state-of-the-art effectiveness and impressive communication efficiency.
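To make the idea of a cosine-based nonlinear quantizer concrete, the following is a minimal sketch of one plausible formulation: values are normalized per tensor, warped through arccos so that uniform integer codes correspond to cosine-spaced reconstruction levels, and inverted with cos on the receiving side. The function names, the per-tensor max-abs scaling, and the exact shape of the warp are illustrative assumptions, not the paper's stated method.

```python
import numpy as np

def cosine_quantize(x, bits=4):
    """Hypothetical cosine-shaped nonlinear quantizer (sketch).

    Normalizes x to [-1, 1], warps it through arccos so that uniform
    levels in the warped domain become cosine-spaced levels in the
    original domain, then rounds to integer codes.
    """
    levels = 2 ** bits - 1
    scale = np.max(np.abs(x)) + 1e-12           # per-tensor scale (assumed)
    u = np.clip(x / scale, -1.0, 1.0)           # normalize to [-1, 1]
    warped = np.arccos(u) / np.pi               # nonlinear map to [0, 1]
    codes = np.round(warped * levels).astype(np.uint8)
    return codes, scale

def cosine_dequantize(codes, scale, bits=4):
    """Invert the warp: uniform integer codes -> cosine-spaced values."""
    levels = 2 ** bits - 1
    return np.cos(codes / levels * np.pi) * scale

# Round-trip example on a synthetic gradient tensor.
g = np.random.randn(1000).astype(np.float32)
codes, s = cosine_quantize(g, bits=4)
g_hat = cosine_dequantize(codes, s, bits=4)
print("mean abs reconstruction error:", np.mean(np.abs(g - g_hat)))
```

The only per-element operations are a clip, an arccos (or cos), and a round, which is consistent with the abstract's claim of low computational complexity; the single scale value per tensor is the kind of small side information the abstract refers to for recovering the compressed weights and gradients.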