In this paper, a green, quantized federated learning (FL) framework that represents data with a finite precision level in both local training and uplink transmission is proposed. The finite precision level is captured through the use of quantized neural networks (QNNs) that quantize weights and activations in a fixed-precision format. In the considered FL model, each device trains its QNN and transmits a quantized training result to the base station. Energy models for local training and for transmission under quantization are rigorously derived. To minimize the energy consumption and the number of communication rounds simultaneously, a multi-objective optimization problem is formulated over the number of local iterations, the number of selected devices, and the precision levels for both local training and transmission, while ensuring convergence under a target accuracy constraint. To solve this problem, the convergence rate of the proposed FL system is analytically derived as a function of the system control variables. The Pareto boundary of the problem is then characterized to provide efficient solutions using the normal boundary intersection method. Design insights on balancing the tradeoff between the two objectives are drawn by applying the Nash bargaining solution and analyzing the derived convergence rate. Simulation results show that the proposed FL framework can reduce the energy consumption needed to reach convergence by up to 52% compared to a baseline FL algorithm that represents data with full precision.
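As a minimal illustration of the fixed-precision representation used by the QNNs, the sketch below implements a stochastic fixed-point quantizer of the kind commonly adopted for quantized training and transmission. The function name `quantize`, the symmetric range `[-x_max, x_max]`, and the choice of stochastic (unbiased) rounding are illustrative assumptions, not the exact quantizer of the proposed framework.

```python
import numpy as np

def quantize(x, n_bits, x_max=1.0):
    """Stochastically round x onto a uniform grid of 2**n_bits levels in [-x_max, x_max].

    Stochastic rounding keeps the quantizer unbiased: E[quantize(x)] = x
    for any x inside the representable range (an assumed, commonly used design).
    """
    step = 2.0 * x_max / (2 ** n_bits - 1)   # distance between adjacent quantization levels
    x = np.clip(np.asarray(x, dtype=float), -x_max, x_max)
    low = np.floor((x + x_max) / step)       # index of the level just below x
    prob_up = (x + x_max) / step - low       # fractional part = probability of rounding up
    idx = low + (np.random.rand(*x.shape) < prob_up)
    return idx * step - x_max                # map level index back to a real value

# Example: quantize a weight vector to 4 bits for local training,
# and to 2 bits before uplink transmission (hypothetical precision levels).
weights = np.array([0.31, -0.72, 0.05])
print(quantize(weights, n_bits=4))
print(quantize(weights, n_bits=2))
```

Under such a scheme, lower precision levels reduce the per-round computation and transmission energy but typically require more communication rounds to reach a target accuracy, which is the tradeoff the multi-objective formulation above is designed to balance.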