Decentralized learning has gained great popularity as a way to improve learning efficiency and preserve data privacy. Each computing node makes an equal contribution to collaboratively learning a deep learning model. Eliminating the centralized Parameter Server (PS) effectively addresses many issues, such as privacy, performance bottlenecks, and single points of failure. However, how to achieve Byzantine Fault Tolerance in decentralized learning systems is rarely explored, although this problem has been extensively studied in centralized systems. In this paper, we present an in-depth study of the Byzantine resilience of decentralized learning systems with two contributions. First, from the adversarial perspective, we theoretically show that Byzantine attacks are more dangerous and feasible in decentralized learning systems: even a single malicious participant can arbitrarily alter the models of other participants by sending carefully crafted updates to its neighbors. Second, from the defense perspective, we propose UBAR, a novel algorithm that enhances decentralized learning with Byzantine Fault Tolerance. Specifically, UBAR provides a Uniform Byzantine-resilient Aggregation Rule for benign nodes to select the useful parameter updates and filter out the malicious ones in each training iteration. It guarantees that each benign node in a decentralized system can train a correct model under very strong Byzantine attacks with an arbitrary number of faulty nodes. We conduct extensive experiments on standard image classification tasks, and the results indicate that UBAR can effectively defeat both simple and sophisticated Byzantine attacks with higher efficiency than existing solutions.
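
To make the aggregation idea concrete, the snippet below is a minimal, hypothetical Python sketch of a two-stage Byzantine-resilient aggregation step in the spirit of UBAR, not the paper's actual implementation: the function name `ubar_style_aggregate`, the ratio parameter `rho`, and the fallback behavior are illustrative assumptions. Stage one keeps the neighbor updates closest to the node's own parameters; stage two keeps only the candidates whose loss on a local training batch is no worse than the node's own, then averages them.

```python
import numpy as np

def ubar_style_aggregate(own_params, neighbor_params, loss_fn, sample_batch, rho=0.4):
    """Hypothetical sketch of a two-stage Byzantine-resilient aggregation step.

    Stage 1 (distance-based): keep the fraction `rho` of neighbor updates
    whose parameters are closest to the node's own parameters.
    Stage 2 (performance-based): keep only candidates whose loss on a local
    training batch is no worse than the node's own loss, then average them.
    Falls back to the single best-performing candidate if none qualify.
    """
    # Stage 1: rank neighbors by Euclidean distance to our own parameters.
    dists = [np.linalg.norm(p - own_params) for p in neighbor_params]
    k = max(1, int(rho * len(neighbor_params)))
    candidate_idx = np.argsort(dists)[:k]
    candidates = [neighbor_params[i] for i in candidate_idx]

    # Stage 2: compare each candidate's loss on a local batch with our own.
    own_loss = loss_fn(own_params, sample_batch)
    losses = [loss_fn(p, sample_batch) for p in candidates]
    good = [p for p, l in zip(candidates, losses) if l <= own_loss]

    if good:  # average the updates judged useful
        return np.mean(good, axis=0)
    # Otherwise keep only the best-performing candidate.
    return candidates[int(np.argmin(losses))]


if __name__ == "__main__":
    # Toy usage: a 1-D "model", 3 honest neighbors and 2 Byzantine ones.
    rng = np.random.default_rng(0)
    loss = lambda w, batch: float(np.mean((batch - w) ** 2))
    data = rng.normal(3.0, 1.0, size=32)          # local training batch
    own = np.array([2.5])
    honest = [own + rng.normal(0, 0.1, 1) for _ in range(3)]
    byzantine = [np.array([100.0]), np.array([-80.0])]
    print(ubar_style_aggregate(own, honest + byzantine, loss, data))
```

The intuition behind such a two-stage design is that cheap distance filtering discards obviously abnormal updates, while the loss comparison on local data catches crafted updates that stay close in parameter space yet would degrade the model.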

