Federated learning (FL) hyper-parameters significantly affect training overheads in terms of computation time, transmission time, computation load, and transmission load. However, the current practice of manually selecting FL hyper-parameters places a high burden on FL practitioners, since different applications have diverse training preferences. In this paper, we propose FedTune, an automatic FL hyper-parameter tuning algorithm tailored to applications' diverse system requirements for FL training. FedTune is lightweight and flexible, achieving 8.48%-26.75% improvement across different datasets compared to fixed FL hyper-parameters.