Federated learning (FL) hyper-parameters significantly affect training overheads in terms of computation time, transmission time, computation load, and transmission load. However, the current practice of manually selecting FL hyper-parameters places a high burden on FL practitioners, since different applications have different training preferences. In this paper, we propose FedTune, an automatic FL hyper-parameter tuning algorithm tailored to applications' diverse system requirements for FL training. FedTune is lightweight and flexible, achieving 4.18%-22.48% improvement over fixed FL hyper-parameters across different datasets. FedTune is available at \url{https://github.com/dtczhl/FedTuning}.