The provision of communication services via portable and mobile devices, such as aerial base stations, is a crucial concept to be realized in 5G/6G networks. Conventionally, IoT/edge devices must transmit their data directly to the base station so that a model can be trained using machine learning techniques. This data transmission introduces privacy issues that might lead to security concerns and monetary losses. Recently, federated learning was proposed to partially address these privacy issues by sharing trained models with the base station instead of raw data. However, the centralized nature of federated learning only allows devices within the vicinity of base stations to share their trained models. Furthermore, the long-range communication compels the devices to increase transmission power, which raises energy-efficiency concerns. In this work, we propose a distributed federated learning (DBFL) framework that overcomes the connectivity and energy-efficiency issues for distant devices. The DBFL framework is compatible with the mobile edge computing architecture and connects devices in a distributed manner using clustering protocols. Experimental results show that the framework improves classification performance by 7.4\% compared to conventional federated learning while reducing energy consumption.
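To make the aggregation flow concrete, the following is a minimal sketch of one round of cluster-based federated averaging, assuming local training is plain gradient descent on a least-squares loss and cluster heads simply average the models they receive over short-range links; the function names (\texttt{local\_update}, \texttt{cluster\_average}, \texttt{dbfl\_round}) and the toy data are hypothetical and not taken from the paper.

\begin{verbatim}
import numpy as np

def local_update(model, data, lr=0.1):
    # One local training step, sketched as gradient descent
    # on a least-squares loss (hypothetical choice).
    X, y = data
    grad = X.T @ (X @ model - y) / len(y)
    return model - lr * grad

def cluster_average(models):
    # Cluster head aggregates models received over short-range links.
    return np.mean(models, axis=0)

def dbfl_round(global_model, clusters):
    # One round: local training -> intra-cluster averaging ->
    # averaging of the few cluster-head models at the base station,
    # so only cluster heads use the long-range link.
    cluster_models = []
    for devices in clusters:  # each cluster is a list of device datasets
        local_models = [local_update(global_model.copy(), d)
                        for d in devices]
        cluster_models.append(cluster_average(local_models))
    return np.mean(cluster_models, axis=0)

# Toy usage: 2 clusters of 3 devices, each with a small regression set.
rng = np.random.default_rng(0)
dim = 5
clusters = [[(rng.normal(size=(20, dim)), rng.normal(size=20))
             for _ in range(3)] for _ in range(2)]
model = np.zeros(dim)
for _ in range(10):
    model = dbfl_round(model, clusters)
\end{verbatim}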