This paper develops an efficient distributed inference algorithm that is robust against a moderate fraction of Byzantine nodes, i.e., arbitrary and possibly adversarial machines in a distributed learning system. In robust statistics, the median-of-means (MOM) has been a popular approach for hedging against Byzantine failures due to its ease of implementation and computational efficiency. However, the MOM estimator suffers a loss of statistical efficiency. The first main contribution of this paper is a variance-reduced median-of-means (VRMOM) estimator, which improves the statistical efficiency of the vanilla MOM estimator while remaining as computationally efficient as MOM. Based on the proposed VRMOM estimator, we develop a general distributed inference algorithm that is robust against Byzantine failures. Theoretically, our distributed algorithm achieves a fast convergence rate with only a constant number of rounds of communication. We also establish asymptotic normality for the purpose of statistical inference. To the best of our knowledge, this is the first normality result in the setting of Byzantine-robust distributed learning. Simulation results are also presented to illustrate the effectiveness of our method.
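The vanilla MOM estimator referenced above can be sketched as follows: partition the sample into groups, average within each group, and return the median of the group means, so that a minority of corrupted groups cannot move the estimate far. This is a minimal illustrative sketch (the function name, group count, and contamination pattern are our own choices, not from the paper):

```python
import numpy as np

def median_of_means(x, k):
    """Vanilla median-of-means: split the sample into k groups,
    average within each group, then take the median of the group
    means. The median step bounds the influence of a minority of
    corrupted (Byzantine-style) groups."""
    groups = np.array_split(np.asarray(x, dtype=float), k)
    return float(np.median([g.mean() for g in groups]))

# Contaminated sample: true mean 0, with a few adversarial outliers.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 1000)
x[:20] = 1e6  # corrupt 2% of the data with extreme values

print(median_of_means(x, k=50))  # stays close to the true mean 0
print(np.mean(x))                # the plain mean is destroyed
```

With 50 groups of 20 points each, the corruption contaminates at most a couple of group means, which the median then discards; the sample mean, by contrast, is pulled to the order of 10^4.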