Federated Learning (FL) is an emerging foundational AI technology, first proposed by Google in 2016 to let Android users update models locally on their devices. Its design goal is to carry out efficient machine learning among multiple participants or computing nodes while guaranteeing information security during big-data exchange, protecting device and personal data privacy, and ensuring legal compliance. The machine learning algorithms usable in federated learning are not limited to neural networks; they also include other important algorithms such as random forests. Federated learning is expected to become the foundation of the next generation of collaborative AI algorithms and collaboration networks.
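The coordination pattern described above can be sketched as a minimal federated-averaging (FedAvg) round: each participant trains on its own private shard and only model parameters, never raw data, reach the server. The linear-regression task, client data, and hyperparameters below are illustrative assumptions, not part of any specific white paper.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient steps of linear regression."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    """Server aggregates client updates, weighted by local sample counts."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_w, X, y))  # raw data never leaves the client
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
# Three clients, each holding a private shard of the data
clients = []
for n in (30, 50, 20):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=n)))

w = np.zeros(2)
for _ in range(50):
    w = fedavg_round(w, clients)
```

After enough rounds the averaged model approaches the weights that fit the pooled data, even though no party ever saw another party's samples.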


Telecommunication networks, with 5G systems as the leading example, have already achieved the "Internet of Everything" and are now evolving toward the "Intelligent Connection of Everything." Telecom networks use advanced AI techniques to collect, transmit, and learn from data promptly and effectively, anytime and anywhere, powering a large number of innovative applications and intelligent services. However, machine learning frameworks built around central servers and data centers face mounting data privacy and security challenges, as well as enormous communication overhead and wasted computing power.

As an emerging distributed machine learning framework, federated learning enables multiple data owners to collaboratively build a shared model while protecting data privacy and remaining legally compliant, achieving a win-win between model training and privacy protection. It is therefore expected to unlock great potential in the telecom field.

Against this background, the white paper analyzes the technical potential and application prospects of federated learning in the telecom industry, and introduces the technical architecture, technical taxonomy, deployment framework, and key optimization techniques of telecom federated learning. In addition, the white paper covers several typical use cases of federated learning in telecom from China Mobile Research Institute, China Unicom Digital Technology Co., Ltd., and Huawei, including ONT precise identification based on horizontal federated learning, detection of unknown website-injection attacks based on horizontal federated learning, consumer finance applications based on vertical federated learning, 5G network QoE evaluation and prediction based on vertical federated learning, and data center PUE control based on federated transfer learning.
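The horizontal/vertical distinction behind these use cases comes down to how the data is partitioned across parties. The following toy split is a hypothetical illustration (the operator/bank pairing mirrors the consumer-finance use case, but the arrays are invented):

```python
import numpy as np

# A toy dataset: rows are subscribers, columns are features
rng = np.random.default_rng(2)
data = rng.normal(size=(6, 4))

# Horizontal FL: parties share the feature space but hold different samples
# (e.g. two operators recording the same KPIs for disjoint subscriber sets)
operator_a, operator_b = data[:3, :], data[3:, :]

# Vertical FL: parties share the sample set but hold different features
# (e.g. a telecom operator and a bank describing the same customers)
telecom_features, bank_features = data[:, :2], data[:, 2:]

assert operator_a.shape[1] == operator_b.shape[1]          # same features
assert telecom_features.shape[0] == bank_features.shape[0]  # same samples
```

Federated transfer learning, the third category, covers the remaining case where parties overlap in neither samples nor features.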

The development and deployment of federated learning in the telecom field is still at an early stage. The white paper accordingly proposes advancing key technologies through demand-driven requirements, strengthening telecom federated learning standards and evaluation work, and accelerating deployment and industrial development, so as to achieve breakthroughs in key telecom federated learning technologies and promote native intelligence in telecom networks.

Applying federated learning in the telecom field will accelerate AI innovation and foster operator-centered cross-domain ecosystem cooperation. Federated learning can be expected to find broad application in autonomous driving networks, edge computing, the Internet of Things, the Internet of Vehicles, user experience improvement, and vertical industries.

http://aiiaorg.cn/uploadfile/2021/0930/20210930015230641.pdf


Latest Content

Today data is often scattered among billions of resource-constrained edge devices with security and privacy constraints. Federated Learning (FL) has emerged as a viable solution to learn a global model while keeping data private, but the model complexity of FL is impeded by the computation resources of edge nodes. In this work, we investigate a novel paradigm to take advantage of a powerful server model to break through model capacity in FL. By selectively learning from multiple teacher clients and itself, a server model develops in-depth knowledge and transfers its knowledge back to clients in return to boost their respective performance. Our proposed framework achieves superior performance on both server and client models and provides several advantages in a unified framework, including flexibility for heterogeneous client architectures, robustness to poisoning attacks, and communication efficiency between clients and server. By bridging FL effectively with larger server model training, our proposed paradigm paves ways for robust and continual knowledge accumulation from distributed and private data.
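The teacher-student exchange this abstract describes can be approximated with a simple ensemble-distillation step: the server model fits the averaged soft predictions of the client models on a shared unlabeled dataset. The linear classifiers, plain averaging rule, and temperature below are illustrative assumptions, not the paper's actual (selective) method:

```python
import numpy as np

def softmax(z, T=2.0):
    """Temperature-softened softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distill_to_server(client_models, server_w, X_public, lr=0.5, steps=200):
    """Server (a linear classifier here) fits the clients' averaged soft labels
    on a shared unlabeled dataset; client training data is never transmitted."""
    # Ensemble of teachers: average their softened predictions
    teacher = np.mean([softmax(X_public @ W) for W in client_models], axis=0)
    W = server_w.copy()
    for _ in range(steps):
        probs = softmax(X_public @ W)
        # Gradient of cross-entropy against the fixed teacher distribution
        grad = X_public.T @ (probs - teacher) / len(X_public)
        W -= lr * grad
    return W

rng = np.random.default_rng(1)
d, k = 4, 3
clients = [rng.normal(size=(d, k)) for _ in range(5)]  # heterogeneous client weights
X_public = rng.normal(size=(200, d))
server = distill_to_server(clients, np.zeros((d, k)), X_public)
```

Because only predictions on public data cross the network, this style of exchange also decouples the server architecture from the clients', which is what allows the larger server model the abstract mentions.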

