Federated Learning (FL) is a communication-efficient and privacy-preserving distributed machine learning framework that has gained a significant amount of research attention recently. Despite the different forms of FL algorithms (e.g., synchronous FL, asynchronous FL) and the underlying optimization methods, nearly all existing works implicitly assume the existence of a communication infrastructure that facilitates direct communication between the server and the clients for model data exchange. This assumption, however, does not hold in many real-world applications that can benefit from distributed learning but lack a proper communication infrastructure (e.g., smart sensing in remote areas). In this paper, we propose a novel FL framework, named FedEx (short for FL via Model Express Delivery), that utilizes mobile transporters (e.g., Unmanned Aerial Vehicles) to establish indirect communication channels between the server and the clients. Two algorithms, called FedEx-Sync and FedEx-Async, are developed depending on whether the transporters adopt a synchronized or an asynchronous schedule. Even though the indirect communications introduce heterogeneous delays to clients for both the global model dissemination and the local model collection, we prove the convergence of both versions of FedEx. The convergence analysis subsequently sheds light on how to assign clients to different transporters and design the routes among the clients. The performance of FedEx is evaluated through experiments in a simulated network on two public datasets.
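To make the transporter-based communication pattern concrete, the following is a minimal, illustrative sketch (not the paper's implementation) of a FedEx-Sync-style round: a transporter follows a route over its assigned clients, delivering the current global model and collecting the locally updated models, which the server then averages. The names `local_step`, `fedex_sync_round`, and `route` are hypothetical, and local training is reduced to one gradient step of a 1-D least-squares problem.

```python
# Illustrative sketch of synchronous FL with indirect, transporter-based
# communication (assumed structure, not the paper's actual algorithm).

def local_step(w, data, lr=0.1):
    # One gradient step on the local loss: mean over (w*x - y)^2.
    g = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * g

def fedex_sync_round(w_global, clients, route):
    # A transporter visits clients along `route`, dropping off the
    # global model and picking up each client's updated local model.
    collected = [local_step(w_global, clients[cid]) for cid in route]
    # Back at the server, the collected models are averaged (FedAvg-style).
    return sum(collected) / len(collected)

# Two clients whose local data share the true model w* = 2.
clients = {
    0: [(x, 2 * x) for x in [1.0, 2.0, 3.0]],
    1: [(x, 2 * x) for x in [0.5, 1.5, 2.5]],
}
w = 0.0
for _ in range(50):
    w = fedex_sync_round(w, clients, route=[0, 1])
print(round(w, 3))  # converges toward 2.0
```

In the asynchronous variant the transporters would return at different times, so the server would aggregate stale local models of heterogeneous delay rather than waiting for every route to complete.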