Federated learning has generated significant interest, with nearly all works focused on a "star" topology where nodes/devices are each connected to a central server. We migrate away from this architecture and extend it through the network dimension to the case where there are multiple layers of nodes between the end devices and the server. Specifically, we develop multi-stage hybrid federated learning (MH-FL), a hybrid of intra- and inter-layer model learning that considers the network as a multi-layer cluster-based structure. MH-FL considers the topology structures among the nodes in the clusters, including local networks formed via device-to-device (D2D) communications, and presumes a semi-decentralized architecture for federated learning. It orchestrates the devices at different network layers in a collaborative/cooperative manner (i.e., using D2D interactions) to form local consensus on the model parameters and combines it with multi-stage parameter relaying between layers of the tree-shaped hierarchy. We derive the upper bound of convergence for MH-FL with respect to parameters of the network topology (e.g., the spectral radius) and the learning algorithm (e.g., the number of D2D rounds in different clusters). We obtain a set of policies for the D2D rounds at different clusters to guarantee either a finite optimality gap or convergence to the global optimum. We then develop a distributed control algorithm for MH-FL to tune the D2D rounds in each cluster over time to meet specific convergence criteria. Our experiments on real-world datasets verify our analytical results and demonstrate the advantages of MH-FL in terms of resource utilization metrics.
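The intra-cluster consensus step described above can be sketched as a standard average-consensus iteration over a D2D mixing matrix, where the matrix's spectral properties govern how fast a cluster agrees on its mean parameters. This is an illustrative sketch only: the function name, the ring topology, and the Metropolis-style weights below are assumptions for the example, not MH-FL's actual implementation.

```python
import numpy as np

def d2d_consensus_rounds(params, W, num_rounds):
    """Approximate a cluster's average model parameters via D2D gossip.

    params: (n_nodes, dim) array; row i holds node i's local parameters.
    W: (n_nodes, n_nodes) doubly stochastic mixing matrix matching the
       cluster's D2D graph. Its second-largest eigenvalue magnitude
       controls how quickly the iterates approach the cluster average,
       mirroring the role the spectral radius plays in the convergence
       bound with respect to the number of D2D rounds.
    num_rounds: number of D2D communication rounds run in this cluster.
    """
    x = np.asarray(params, dtype=float).copy()
    for _ in range(num_rounds):
        x = W @ x  # each node averages with its D2D neighbors
    return x

# Hypothetical 4-node ring cluster with Metropolis-style weights.
n = 4
W = np.zeros((n, n))
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        W[i, j] = 1 / 3          # neighbor weight (degree-2 ring)
    W[i, i] = 1 - W[i].sum()     # self-weight keeps rows stochastic

local_params = np.array([[0.0], [1.0], [2.0], [9.0]])
after = d2d_consensus_rounds(local_params, W, num_rounds=20)
# Every node is now close to the cluster mean (3.0), so any single
# node can relay that value to the next layer of the hierarchy.
```

After the cluster reaches near-consensus, one sampled node's parameters stand in for the cluster average at the parent layer, which is what makes the multi-stage relaying up the tree cheap relative to having every device upload to the server.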