We investigate the local differential privacy (LDP) guarantees of a randomized privacy mechanism via its contraction properties. We first show that LDP constraints can be equivalently cast in terms of the contraction coefficient of the $E_\gamma$-divergence. We then use this equivalence to express the LDP guarantees of privacy mechanisms in terms of contraction coefficients of arbitrary $f$-divergences. When combined with standard estimation-theoretic tools (such as Le Cam's and Fano's converse methods), this result allows us to study the trade-off between privacy and utility in several hypothesis testing, minimax estimation, and Bayesian estimation problems.
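The equivalence stated above can be sketched as follows, using the standard definition of the $E_\gamma$-divergence (also known as the hockey-stick divergence); the notation $K(\cdot\mid x)$ for the mechanism's output distribution on input $x$ is ours:

```latex
% E_gamma-divergence (hockey-stick divergence) between distributions P, Q,
% for gamma >= 1:
E_\gamma(P \,\|\, Q) \;=\; \sup_{A} \bigl[\, P(A) - \gamma\, Q(A) \,\bigr].
% A mechanism K satisfies eps-LDP, i.e.
%   K(A | x) <= e^{eps} K(A | x')  for all measurable A and all inputs x, x',
% exactly when the E_{e^eps}-divergence between any two output
% distributions vanishes:
\sup_{x,\,x'} \, E_{e^{\varepsilon}}\!\bigl( K(\cdot \mid x) \,\big\|\, K(\cdot \mid x') \bigr) \;=\; 0.
```

The rewriting is immediate: $E_{e^\varepsilon}(K(\cdot\mid x)\,\|\,K(\cdot\mid x'))=0$ holds precisely when $K(A\mid x)\le e^\varepsilon K(A\mid x')$ for every event $A$, which is the $\varepsilon$-LDP constraint for the pair $(x,x')$.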