Sixth-Generation (6G)-based Internet of Everything applications (e.g., autonomous driving cars) have witnessed remarkable interest. Autonomous driving cars using federated learning (FL) can enable a variety of smart services. Although FL implements distributed machine learning model training without requiring device data to be moved to a centralized server, it has its own implementation challenges, such as robustness, centralized-server security, communication resource constraints, and privacy leakage due to the ability of a malicious aggregation server to infer sensitive information about end-devices. To address these limitations, a dispersed federated learning (DFL) framework for autonomous driving cars is proposed to offer robust, communication-resource-efficient, and privacy-aware learning. A mixed-integer non-linear programming (MINLP) optimization problem is formulated to jointly minimize the loss in FL model accuracy due to packet errors and the transmission latency. Because the formulated MINLP problem is NP-hard and non-convex, we propose a Block Successive Upper-bound Minimization (BSUM)-based solution. Furthermore, the performance of the proposed scheme is compared with three baseline schemes. Extensive numerical results demonstrate the validity of the proposed BSUM-based scheme.
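For context, a minimal sketch of the generic BSUM template follows; this is the standard form of the method, not the paper's specific formulation, and the symbols $f$, $u_i$, $\mathcal{X}_i$ are generic placeholders rather than notation from this work. At iteration $r$, BSUM selects one block of variables $x_i$ and minimizes a locally tight upper bound on the objective while holding the other blocks fixed:

$$
x_i^{r+1} \in \arg\min_{x_i \in \mathcal{X}_i} \; u_i\!\left(x_i;\, x^{r}\right),
\qquad
x_j^{r+1} = x_j^{r} \;\; \text{for } j \neq i,
$$

where the surrogate $u_i(\cdot;\, x^{r})$ satisfies $u_i(x_i^{r};\, x^{r}) = f(x^{r})$ and $u_i(x_i;\, x^{r}) \ge f(x_i,\, x_{-i}^{r})$ for all $x_i \in \mathcal{X}_i$. Cycling over blocks monotonically decreases the objective, which is what makes the template attractive for non-convex, NP-hard problems such as the MINLP formulated here: each per-block surrogate subproblem can be made convex and tractable even though the joint problem is not.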