We investigate the contraction properties of locally differentially private mechanisms. More specifically, we derive tight upper bounds on the divergence between the output distributions $PK$ and $QK$ of an $\epsilon$-LDP mechanism $K$ in terms of a divergence between the corresponding input distributions $P$ and $Q$. Our first main technical result presents a sharp upper bound on the $\chi^2$-divergence $\chi^2(PK\|QK)$ in terms of $\chi^2(P\|Q)$ and $\epsilon$. We also show that the same result holds for a large family of divergences, including KL-divergence and squared Hellinger distance. Our second main technical result gives an upper bound on $\chi^2(PK\|QK)$ in terms of the total variation distance $TV(P, Q)$ and $\epsilon$. We then utilize these bounds to establish locally private versions of the Cram\'er-Rao bound, Le Cam's method, Assouad's method, and the mutual information method, which are powerful tools for bounding minimax estimation risks. These results are shown to lead to better privacy analyses than the state of the art in several statistical problems, such as entropy and discrete distribution estimation, non-parametric density estimation, and hypothesis testing.