Federated learning (FL) is a distributed model training paradigm that preserves clients' data privacy. FL hyper-parameters significantly affect training overheads in terms of computation time, transmission time, computation load, and transmission load. However, the current practice of manually selecting FL hyper-parameters imposes a heavy burden on FL practitioners, since different applications have different training preferences. In this paper, we propose FedTuning, an automatic FL hyper-parameter tuning algorithm tailored to applications' diverse system requirements for FL training. FedTuning is lightweight and flexible, achieving an average improvement of 22.48% across different training preferences compared to fixed FL hyper-parameters. FedTuning is available at https://github.com/dtczhl/FedTuning.