We analyse the privacy leakage of noisy stochastic gradient descent by modelling R\'enyi divergence dynamics with Langevin diffusions. Inspired by recent work on non-stochastic algorithms, we derive similar desirable properties in the stochastic setting. In particular, we prove that the privacy loss converges exponentially fast for smooth and strongly convex objectives under a constant step size, which is a significant improvement over previous DP-SGD analyses. We also extend our analysis to arbitrary sequences of varying step sizes and derive new utility bounds. Finally, we propose an implementation, and our experiments show the practical utility of our approach compared to classical DP-SGD libraries.
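The abstract refers to noisy stochastic gradient descent with a constant step size. As a point of reference only, the sketch below shows the standard noisy SGD update (stochastic gradient plus isotropic Gaussian noise); the function and parameter names (`noisy_sgd`, `eta`, `sigma`) are illustrative assumptions, not taken from the paper, and practical DP-SGD implementations additionally apply per-example gradient clipping and minibatch subsampling.

```python
import numpy as np

def noisy_sgd(grad, theta0, eta, sigma, n_steps, rng=None):
    """Minimal noisy SGD sketch: theta <- theta - eta * (grad(theta) + Gaussian noise).

    grad    : callable returning a stochastic gradient estimate at theta
    theta0  : initial parameter vector
    eta     : constant step size
    sigma   : standard deviation of the injected Gaussian noise
    n_steps : number of iterations
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(n_steps):
        g = grad(theta)                                   # stochastic gradient estimate
        noise = sigma * rng.standard_normal(theta.shape)  # isotropic Gaussian perturbation
        theta = theta - eta * (g + noise)                 # constant step size update
    return theta
```

Under the smoothness and strong convexity assumptions mentioned above, the abstract's claim is that the R\'enyi privacy loss of this kind of iteration converges exponentially fast in the number of steps, rather than growing with it as in classical composition-based DP-SGD accounting.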