Recently, local peer topology has been shown to influence the overall convergence of decentralized learning (DL) graphs in the presence of data heterogeneity. In this paper, we demonstrate the advantages of constructing a proxy-based locally heterogeneous DL topology that enhances convergence while maintaining data privacy. In particular, we propose a novel peer clumping strategy to efficiently cluster peers before arranging them in a final training graph. By showing that locally heterogeneous graphs outperform locally homogeneous graphs of similar size drawn from the same global data distribution, we present a strong case for topological pre-processing. Moreover, we demonstrate the scalability of our approach: the topological pre-processing overhead remains small in large graphs while the performance gains become even more pronounced. Furthermore, we show the robustness of our approach in the presence of network partitions.
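To make the two-stage idea concrete, the sketch below illustrates one plausible instantiation: peers are first clumped by a privacy-preserving proxy of their local data (here, normalized label histograms clustered with k-means), and the final training graph is then assembled so that neighboring peers come from different clumps, making each local neighborhood data-heterogeneous. The proxy choice, the k-means clumping, the ring layout, and all function names are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from sklearn.cluster import KMeans

def clump_peers(label_histograms, n_clumps):
    """Group peers whose local label distributions look alike.

    label_histograms: (n_peers, n_classes) array of normalized label
    counts, used as a proxy for each peer's raw data so that no raw
    samples need to be shared.
    """
    km = KMeans(n_clusters=n_clumps, n_init=10, random_state=0)
    return km.fit_predict(label_histograms)  # clump id per peer

def heterogeneous_ring(clump_ids):
    """Order peers so consecutive ring neighbors come from different
    clumps, making every local neighborhood data-heterogeneous."""
    clumps = {}
    for peer, cid in enumerate(clump_ids):
        clumps.setdefault(cid, []).append(peer)
    pools = list(clumps.values())
    order = []
    # Round-robin over clumps: take one peer from each clump in turn.
    while any(pools):
        for pool in pools:
            if pool:
                order.append(pool.pop())
    # Ring edges connect each peer to its successor (wrapping around).
    edges = [(order[i], order[(i + 1) % len(order)])
             for i in range(len(order))]
    return order, edges

# Example: 12 peers, each holding a skewed 4-class label histogram.
rng = np.random.default_rng(0)
hists = rng.dirichlet(alpha=[0.2] * 4, size=12)  # non-IID proxies
ids = clump_peers(hists, n_clumps=4)
order, edges = heterogeneous_ring(ids)
print("clump ids:", ids)
print("ring order:", order)
```

In this reading, the clumping step groups locally homogeneous peers, and the graph-construction step deliberately mixes clumps within each neighborhood, so each peer averages with neighbors whose data complements its own.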