Federated learning (FL) is a distributed machine learning framework in which a model is learned across several decentralized edge clients without sharing their local datasets. This strategy limits data leakage and enables on-device training, since the global model is updated from the clients' local model updates rather than from raw data. Despite its advantages, including data privacy and scalability, FL faces challenges such as statistical and system heterogeneity in federated networks, communication bottlenecks, and privacy and security concerns. This survey systematically summarizes prior work, studies, and experiments on FL and outlines its potential across a range of applications and use cases. In addition, it discusses the main challenges of implementing FL and promising research directions for addressing them.
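To make the update pattern described above concrete, the following is a minimal sketch of FedAvg-style aggregation: each client trains on its private data and shares only a model update, which the server averages into the global model. The function names, the toy local training step, and the synthetic client data are illustrative assumptions, not taken from the survey.

```python
# Minimal FedAvg-style sketch: clients share model updates, never raw data.
import numpy as np

def local_update(global_weights: np.ndarray, local_data: np.ndarray,
                 lr: float = 0.1) -> np.ndarray:
    """Placeholder local training step: one step toward the mean of the
    client's private data (stands in for real local SGD on a model)."""
    return global_weights - lr * (global_weights - local_data.mean(axis=0))

def fedavg(global_weights: np.ndarray,
           client_datasets: list[np.ndarray]) -> np.ndarray:
    """Aggregate local updates into the global model, weighting each client
    by its number of samples; raw data never leaves the clients."""
    total = sum(len(d) for d in client_datasets)
    updates = [local_update(global_weights, d) for d in client_datasets]
    return sum((len(d) / total) * w for d, w in zip(client_datasets, updates))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic non-IID clients: different sizes and feature distributions.
    clients = [rng.normal(loc=i, size=(20 + 10 * i, 3)) for i in range(3)]
    w = np.zeros(3)  # initial global model
    for _ in range(5):  # communication rounds
        w = fedavg(w, clients)
    print("global model after 5 rounds:", w)
```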