This paper proposes a novel watchdog privatization scheme that generalizes local information privacy (LIP) to enhance data utility. To protect sensitive features $S$ correlated with some useful data $X$, LIP restricts the lift: the ratio of the posterior belief on $S$ after accessing $X$ to the prior belief on $S$. For each symbol $x$, both the maximum and minimum lift over sensitive features measure the privacy risk of publishing that symbol and should be restricted for privacy preservation. Previous works enforce the same bound on both the max-lift and the min-lift. However, empirical observations show that the min-lift is usually much smaller than the max-lift. In this work, we generalize the LIP definition to allow unequal max- and min-lift values, i.e., we impose different bounds on the max-lift and the min-lift. We apply this new definition to the watchdog privacy mechanism and demonstrate that utility is enhanced under a given local differential privacy constraint. At the same time, the resulting max-lift is lower and therefore more tightly restricts other privacy leakage measures, e.g., mutual information, maximal leakage, and $\alpha$-leakage.
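The lift constraints described above can be sketched as follows; the bound symbols $\varepsilon$, $\varepsilon_l$, and $\varepsilon_u$ are assumed notation for illustration, not taken verbatim from the paper.

```latex
% Lift of a published symbol x with respect to a sensitive value s:
\[
  l(s,x) \triangleq \frac{P_{S \mid X}(s \mid x)}{P_S(s)}
\]
% Symmetric $\varepsilon$-LIP (previous works): one bound for both sides,
\[
  e^{-\varepsilon} \le l(s,x) \le e^{\varepsilon} \quad \forall s, x.
\]
% Asymmetric generalization (this paper): separate bounds
% $\varepsilon_l$ on the min-lift and $\varepsilon_u$ on the max-lift,
\[
  e^{-\varepsilon_l} \le l(s,x) \le e^{\varepsilon_u} \quad \forall s, x.
\]
```

Since the min-lift is empirically much smaller than the max-lift, choosing $\varepsilon_l > \varepsilon_u$ lets the watchdog mechanism release more symbols (improving utility) while keeping the max-lift, which controls leakages such as maximal leakage, tightly bounded.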