Low Earth Orbit (LEO) satellite constellations have seen a surge in deployment over the past few years by virtue of their ability to provide broadband Internet access as well as to collect vast amounts of Earth observation data that can be utilized to develop AI on a global scale. As traditional machine learning (ML) approaches that train a model by downloading satellite data to a ground station (GS) are not practical, Federated Learning (FL) offers a potential solution. However, existing FL approaches cannot be readily applied because of their excessively prolonged training time caused by the challenging satellite-GS communication environment. This paper proposes FedHAP, which introduces high-altitude platforms (HAPs) as distributed parameter servers (PSs) into FL for satellite communications (Satcom), or more concretely LEO constellations, to achieve fast and efficient model training. FedHAP consists of three components: 1) a hierarchical communication architecture, 2) a model dissemination algorithm, and 3) a model aggregation algorithm. Our extensive simulations demonstrate that FedHAP significantly accelerates FL model convergence compared to state-of-the-art baselines, cutting the training time from several days down to a few hours while achieving higher accuracy.
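To make the hierarchical architecture concrete, the sketch below shows one possible two-tier aggregation flow, assuming a FedAvg-style weighted average at each HAP over the satellites in its view, followed by an unweighted consensus across HAPs. This is a minimal illustration under stated assumptions, not the paper's actual dissemination or aggregation algorithm; the function names (`local_update`, `hap_aggregate`, `global_aggregate`) are hypothetical.

```python
import numpy as np

def local_update(weights, grad, lr=0.1):
    # One satellite's local SGD step (placeholder for real on-board training).
    return weights - lr * grad

def hap_aggregate(satellite_models, sample_counts):
    # A HAP, acting as a distributed parameter server, averages the models
    # of satellites currently in view, weighted by their local sample
    # counts (FedAvg-style). This is an assumed aggregation rule.
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(satellite_models, sample_counts))

def global_aggregate(hap_models):
    # HAPs exchange their partial aggregates over inter-HAP links and
    # merge them into one global model (simple unweighted mean here).
    return sum(hap_models) / len(hap_models)
```

Because HAPs stay quasi-stationary above the constellation, satellites can upload to a nearby HAP instead of waiting for a GS pass, which is the intuition behind the reported reduction in training time.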