Federated Learning (FL) can be used in mobile edge networks to train machine learning models in a distributed manner. Recently, FL has been interpreted within a Model-Agnostic Meta-Learning (MAML) framework, which gives FL significant advantages in fast adaptation and convergence over heterogeneous datasets. However, existing research simply combines MAML and FL without explicitly addressing how much benefit MAML brings to FL or how to maximize that benefit over mobile edge networks. In this paper, we quantify the benefit from two aspects: optimizing FL hyperparameters (i.e., sampled data size and the number of communication rounds) and resource allocation (i.e., transmit power) in mobile edge networks. Specifically, we formulate the MAML-based FL design as an overall learning time minimization problem under constraints on model accuracy and energy consumption. Facilitated by the convergence analysis of MAML-based FL, we decompose the formulated problem and then solve it using analytical solutions and the coordinate descent method. With the obtained FL hyperparameters and resource allocation, we design a MAML-based FL algorithm, called Automated Federated Learning (AutoFL), that achieves fast adaptation and convergence. Extensive experimental results verify that AutoFL outperforms other benchmark algorithms in terms of learning time and convergence performance.
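To make the MAML-based FL update concrete, the following is a minimal first-order sketch of the client adaptation and server aggregation loop the abstract refers to. It is an illustrative toy (linear regression with NumPy), not the paper's AutoFL algorithm: the sampled data size, number of rounds, and step sizes are assumed values, and the joint optimization of these hyperparameters with transmit power is omitted.

```python
import numpy as np

# First-order MAML-based FL sketch on toy linear-regression clients.
# Hyperparameter values (sample_size, rounds, alpha, beta) are assumptions
# for illustration, not the optimized quantities derived in the paper.

rng = np.random.default_rng(0)

def make_client(dim=5, n=200):
    """Generate a heterogeneous client with its own ground-truth weights and data."""
    w_true = rng.normal(size=dim)
    X = rng.normal(size=(n, dim))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X, y

def grad(w, X, y):
    """Gradient of the mean-squared-error loss 0.5 * ||Xw - y||^2 / n."""
    return X.T @ (X @ w - y) / len(y)

def local_meta_update(w, X, y, sample_size, alpha=0.05):
    """One MAML-style client step: adapt on a support batch, then return the
    first-order meta-gradient evaluated on a disjoint query batch."""
    idx = rng.choice(len(y), size=2 * sample_size, replace=False)
    sup, qry = idx[:sample_size], idx[sample_size:]
    w_adapted = w - alpha * grad(w, X[sup], y[sup])   # inner adaptation
    return grad(w_adapted, X[qry], y[qry])            # first-order meta-gradient

# Federated meta-training loop: clients compute meta-gradients in parallel,
# the server averages them and updates the global meta-model.
clients = [make_client() for _ in range(10)]
w = np.zeros(5)
rounds, sample_size, beta = 50, 32, 0.5

for r in range(rounds):
    meta_grads = [local_meta_update(w, X, y, sample_size) for X, y in clients]
    w -= beta * np.mean(meta_grads, axis=0)           # server aggregation

print("final meta-model norm:", np.linalg.norm(w))
```

In the full design, the sampled data size and the number of communication rounds above are not fixed but chosen by solving the learning time minimization problem, jointly with the transmit power of each edge device.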