Federated learning (FL) is a distributed model training paradigm that preserves clients' data privacy. FL hyper-parameters significantly affect training overhead in terms of time, computation, and communication. However, the current practice of manually selecting FL hyper-parameters places a heavy burden on FL practitioners, since different applications have different training preferences. In this paper, we propose FedTuning, an automatic FL hyper-parameter tuning algorithm tailored to applications' diverse system requirements for FL training. FedTuning is lightweight and flexible, achieving an average improvement of 41% across different training preferences on time, computation, and communication compared to fixed FL hyper-parameters. FedTuning is available at https://github.com/dtczhl/FedTuning.