Federated Learning (FL) is a machine learning approach that enables the training of shared models for powerful applications while allowing data to remain on devices. This approach provides benefits such as improved data privacy and security, as well as reduced latency. However, in some systems, direct communication between clients and the server may not be possible, for example in remote areas lacking proper communication infrastructure. To overcome this challenge, a new framework called FedEx (Federated Learning via Model Express Delivery) is proposed. FedEx employs mobile transporters, such as UAVs, to establish indirect communication channels between the server and the clients: the transporters act as intermediaries that carry model information back and forth. The use of indirect communication introduces new challenges for convergence analysis and optimization, as the delay caused by the transporters' travel affects both global model dissemination and local model collection. To address this, two algorithms, FedEx-Sync and FedEx-Async, are proposed for synchronized and asynchronous learning at the transporter level. In addition, a bi-level optimization algorithm is proposed to solve the joint client assignment and route planning problem. Experimental validation on two public datasets in a simulated network yields results consistent with the theory and demonstrates the efficacy of FedEx.
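To make the workflow concrete, the following is a minimal Python sketch of how a single synchronized round, in the spirit of FedEx-Sync, could be organized: each transporter delivers the current global model to its assigned clients, collects their local updates along its route, and the server aggregates only after all transporters return. All names (`local_update`, `fedavg`, `fedex_sync_round`, `transporter_routes`) and the linear-regression clients are illustrative assumptions, not the paper's implementation; the sketch also omits the travel delays and staleness handling that the actual framework analyzes.

```python
import numpy as np

# Hypothetical, simplified sketch of one synchronized FedEx-style round.
# Names and the toy linear-regression task are illustrative, not the
# paper's actual algorithm or code.

def local_update(global_model, data, lr=0.1, epochs=1):
    """One client's local gradient steps on a linear model (MSE loss)."""
    w = global_model.copy()
    X, y = data
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg(models, weights):
    """FedAvg-style weighted average of client models."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return sum(wgt * m for wgt, m in zip(weights, models))

def fedex_sync_round(global_model, transporter_routes, client_data):
    """One synchronized round: every transporter visits its assigned clients,
    delivers the global model, collects local updates, and the server
    aggregates only after all transporters have returned."""
    collected, sizes = [], []
    for route in transporter_routes:      # each transporter's client list
        for cid in route:                 # visit clients along the route
            X, y = client_data[cid]
            collected.append(local_update(global_model, (X, y)))
            sizes.append(len(y))
    return fedavg(collected, sizes)       # server-side aggregation

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n = 5, 40
    true_w = rng.normal(size=d)
    clients = {}
    for c in range(4):
        X = rng.normal(size=(n, d))
        clients[c] = (X, X @ true_w + 0.1 * rng.normal(size=n))
    routes = [[0, 1], [2, 3]]             # two transporters, two clients each
    w = np.zeros(d)
    for _ in range(50):
        w = fedex_sync_round(w, routes, clients)
    print("estimation error:", np.linalg.norm(w - true_w))
```

In the asynchronous variant (FedEx-Async), the server would presumably incorporate each transporter's collected updates as soon as that transporter returns, rather than waiting for all of them, trading staleness for faster model refreshes.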