We consider how to privately share the personalized privacy losses incurred by objective perturbation, using per-instance differential privacy (pDP). Standard differential privacy (DP) gives us a worst-case bound that might be orders of magnitude larger than the privacy loss to a particular individual relative to a fixed dataset. The pDP framework provides a more fine-grained analysis of the privacy guarantee to a target individual, but the per-instance privacy loss itself might be a function of sensitive data. In this paper, we analyze the per-instance privacy loss of releasing a private empirical risk minimizer learned via objective perturbation, and propose a group of methods to privately and accurately publish the pDP losses at little to no additional privacy cost.
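For concreteness, the following is one standard way to write the two objects the abstract refers to; the notation (loss $\ell$, regularization parameter $\lambda$, noise vector $b$, dataset $Z$, target individual $z$) is assumed here for illustration rather than taken from the paper itself. Objective perturbation releases the minimizer of a randomly perturbed regularized empirical risk,
\[
\hat{\theta} \in \operatorname*{arg\,min}_{\theta} \; \sum_{i=1}^{n} \ell(\theta; z_i) \;+\; \frac{\lambda}{2}\lVert\theta\rVert_2^2 \;+\; b^\top \theta,
\]
where $b$ is a random noise vector, and the per-instance DP condition fixes a particular pair $(Z, z)$: a mechanism $\mathcal{A}$ is $(\varepsilon, \delta)$-pDP for $(Z, z)$ if, for all measurable sets $S$,
\[
\Pr[\mathcal{A}(Z) \in S] \le e^{\varepsilon} \Pr[\mathcal{A}(Z \cup \{z\}) \in S] + \delta
\quad\text{and}\quad
\Pr[\mathcal{A}(Z \cup \{z\}) \in S] \le e^{\varepsilon} \Pr[\mathcal{A}(Z) \in S] + \delta.
\]
The dependence of the smallest such $\varepsilon$ on $(Z, z)$ is what makes the pDP loss itself potentially sensitive.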