Federated learning (FL) is an effective technique for directly involving edge devices in machine learning training while preserving client privacy. However, the substantial communication overhead of FL makes training challenging when edge devices have limited network bandwidth. Existing work on optimizing FL bandwidth overlooks downstream transmission and does not account for FL client sampling. In this paper, we propose GlueFL, a framework that incorporates new client sampling and model compression algorithms to mitigate the low download bandwidth of FL clients. GlueFL prioritizes recently used clients and bounds the number of changed positions in compression masks in each round. Across three popular FL datasets and three state-of-the-art FL strategies, GlueFL reduces downstream client bandwidth by 27% on average and reduces training time by 29% on average.
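To make the two core ideas concrete, below is a minimal Python sketch of (1) client sampling that reuses part of the previous round's cohort and (2) a top-k compression mask whose position changes per round are bounded. All names here (sample_clients, update_mask, sticky_count, max_changes) are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def sample_clients(all_clients, prev_clients, round_size, sticky_count, rng):
    """Sticky-sampling sketch: keep `sticky_count` clients from the previous
    round's cohort and fill the remaining slots with fresh clients."""
    sticky = list(rng.choice(prev_clients, size=sticky_count, replace=False))
    fresh_pool = [c for c in all_clients if c not in sticky]
    fresh = list(rng.choice(fresh_pool, size=round_size - sticky_count,
                            replace=False))
    return sticky + fresh

def update_mask(update, prev_mask, k, max_changes):
    """Bounded-change top-k mask sketch: build a size-k mask that differs
    from `prev_mask` in at most `max_changes` newly admitted positions."""
    # Unconstrained top-k positions of the aggregated update
    top_k = np.zeros_like(prev_mask, dtype=bool)
    top_k[np.argsort(np.abs(update))[-k:]] = True
    # Positions that would newly enter the mask, strongest first,
    # truncated to the per-round change budget
    entering = np.flatnonzero(top_k & ~prev_mask)
    entering = entering[np.argsort(np.abs(update[entering]))[::-1]]
    entering = entering[:max_changes]
    # Start from the old mask, admit the budgeted new positions, then
    # evict the weakest old positions to keep the mask size at k
    mask = prev_mask.copy()
    mask[entering] = True
    excess = int(mask.sum()) - k
    if excess > 0:
        old_positions = np.flatnonzero(prev_mask)
        weakest = old_positions[
            np.argsort(np.abs(update[old_positions]))[:excess]]
        mask[weakest] = False
    return mask
```

Because most of the mask is stable across rounds, a client that participated recently already holds most of the masked coordinates, which is why pairing sticky sampling with bounded mask changes reduces downstream traffic.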