We study the generation of dependent random numbers in a distributed fashion in order to enable privatized distributed learning by networked agents. We propose a method that we refer to as local graph-homomorphic processing; it relies on the construction of dependent noises across the edges of the graph to ensure a desired level of differential privacy. We show that the added noise does not affect the performance of the learned model. This is a significant improvement over previous works on differential privacy for distributed algorithms, where the noise was added in a less structured manner without respecting the graph topology, often leading to performance deterioration. We illustrate the theoretical results by considering a linear regression problem over a network of agents.
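As a rough illustration of the idea (a minimal sketch, not the paper's exact construction), the snippet below generates one shared noise sample per edge and assigns it with opposite signs to the two endpoint agents. Each agent's local perturbation then looks like ordinary additive noise, yet the perturbations are dependent by design and cancel when aggregated over the network, which is the intuition for why the learned model is unaffected. The toy topology, Laplace scale, and function name here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_graph_homomorphic_noise(edges, num_agents, dim, scale=1.0):
    """Illustrative sketch: one Laplace sample per edge, used with
    opposite signs by its two endpoints, so the per-agent noises are
    dependent and sum to zero across the network."""
    noise = np.zeros((num_agents, dim))
    for k, l in edges:
        g = rng.laplace(0.0, scale, size=dim)
        noise[k] += g  # agent k perturbs its message with +g
        noise[l] -= g  # neighbor l uses -g, cancelling in the aggregate
    return noise

# hypothetical ring of 4 agents
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
noise = local_graph_homomorphic_noise(edges, num_agents=4, dim=3)
print(np.abs(noise.sum(axis=0)).max())  # ~0: noises cancel network-wide
```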