Many modern machine learning algorithms are composed of simple private algorithms; thus, an increasingly important problem is to efficiently compute the overall privacy loss under composition. In this study, we introduce the Edgeworth Accountant, an analytical approach to composing differential privacy guarantees of private algorithms. The Edgeworth Accountant starts by losslessly tracking the privacy loss under composition using the $f$-differential privacy framework, which allows us to express the privacy guarantees using privacy-loss log-likelihood ratios (PLLRs). As the name suggests, this accountant next uses the Edgeworth expansion to upper and lower bound the probability distribution of the sum of the PLLRs. Moreover, by relying on a technique for approximating complex distributions using simple ones, we demonstrate that the Edgeworth Accountant can be applied to the composition of any noise-addition mechanism. Owing to certain appealing features of the Edgeworth expansion, the $(\epsilon, \delta)$-differential privacy bounds offered by this accountant are non-asymptotic and incur essentially no extra computational cost, in contrast to prior approaches whose running times increase with the number of compositions. Finally, we demonstrate that our upper and lower $(\epsilon, \delta)$-differential privacy bounds are tight in federated analytics and in certain regimes of training private deep learning models.
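To make the pipeline concrete, the following is a minimal sketch (not the authors' implementation) of an Edgeworth-style accountant for the $n$-fold composition of the Gaussian mechanism. It approximates the distribution of the sum of per-step PLLRs with a second-order Edgeworth expansion built from the per-step cumulants, and then converts the two tail probabilities into a $\delta(\epsilon)$ estimate. The function names, the choice of mechanism, and the parameter values (`sigma`, `n`, `eps`) are illustrative assumptions, not quantities fixed by the abstract.

```python
# A minimal sketch, assuming the Gaussian mechanism with sensitivity 1 and
# noise scale sigma.  The per-step PLLR is log(dP/dQ) with the output drawn
# from P (resp. Q), where P = N(1, sigma^2) and Q = N(0, sigma^2).
import numpy as np
from scipy.stats import norm


def edgeworth_cdf(x, n, k1, k2, k3, k4):
    """Second-order Edgeworth approximation to the CDF, at x, of the sum of
    n i.i.d. variables whose first four cumulants are k1..k4."""
    z = (x - n * k1) / np.sqrt(n * k2)
    lam3 = k3 / k2 ** 1.5          # skewness of one summand
    lam4 = k4 / k2 ** 2            # excess kurtosis of one summand
    correction = (
        lam3 / (6 * np.sqrt(n)) * (z ** 2 - 1)
        + lam4 / (24 * n) * (z ** 3 - 3 * z)
        + lam3 ** 2 / (72 * n) * (z ** 5 - 10 * z ** 3 + 15 * z)
    )
    return norm.cdf(z) - norm.pdf(z) * correction


def delta_of_eps(eps, n, sigma):
    """Approximate delta(eps) for n Gaussian-mechanism compositions via
    delta = P_P[sum of PLLRs > eps] - e^eps * P_Q[sum of PLLRs > eps]."""
    mu = 1.0 / sigma
    # Cumulants of one PLLR: under P it is N(mu^2/2, mu^2) and under Q it is
    # N(-mu^2/2, mu^2), so the third and fourth cumulants vanish and the
    # expansion is exact here; for subsampled or non-Gaussian mechanisms
    # they would be nonzero and the correction terms matter.
    k_p = (mu ** 2 / 2, mu ** 2, 0.0, 0.0)
    k_q = (-mu ** 2 / 2, mu ** 2, 0.0, 0.0)
    tail_p = 1 - edgeworth_cdf(eps, n, *k_p)
    tail_q = 1 - edgeworth_cdf(eps, n, *k_q)
    return tail_p - np.exp(eps) * tail_q


if __name__ == "__main__":
    # Example: 1000 compositions with noise scale sigma = 60 at eps = 1.
    print(delta_of_eps(eps=1.0, n=1000, sigma=60.0))
```

Because the per-step cumulants are computed once and the expansion is evaluated in closed form, the cost of this conversion does not grow with the number of compositions, which is the property the abstract contrasts with earlier accountants.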