We present practical aspects of implementing a pseudo posterior synthesizer for microdata dissemination under a new re-weighting strategy for utility maximization of released synthetic data. Our re-weighting strategy applies to any vector-weighting approach under which a vector of observation-indexed weights is used to downweight likelihood contributions for high disclosure risk records. We demonstrate our method on two different vector-weighted schemes that target high-risk records by exponentiating each of their likelihood contributions with a record-indexed weight, $\alpha_i \in [0,1]$ for record $i \in (1,\ldots,n)$. We compute the overall Lipschitz bound, $\Delta_{\boldsymbol{\alpha},\mathbf{x}}$, for the database $\mathbf{x}$ under each vector-weighted scheme, which induces a local privacy guarantee $\epsilon_{\mathbf{x}} = 2\Delta_{\boldsymbol{\alpha},\mathbf{x}}$. Our new method for constructing record-indexed downweighting maximizes data utility under any privacy budget for the vector-weighted synthesizers by adjusting the by-record weights, $(\alpha_{i})_{i = 1}^{n}$, such that their individual Lipschitz bounds, $\Delta_{\boldsymbol{\alpha},x_{i}}$, approach the bound for the entire database, $\Delta_{\boldsymbol{\alpha},\mathbf{x}}$. Our method asymptotically (as sample size grows) achieves an $(\epsilon = 2\Delta_{\boldsymbol{\alpha}})$-differential privacy (DP) guarantee, globally, over the space of databases, $\mathbf{x}\in\mathcal{X}$. We illustrate our methods using simulated count data with and without over-dispersion-induced skewness and compare the results to a scalar-weighted synthesizer under the Exponential Mechanism. We demonstrate our asymptotic DP result in a simulation study, and we apply our methods to a sample of the Survey of Doctorate Recipients.
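As an illustrative sketch (not quoted from the paper), the vector-weighted pseudo posterior exponentiates each record's likelihood contribution by its weight, and the database-level Lipschitz bound is assumed here to be the largest weighted log-likelihood contribution over records and the parameter space:
\[
\xi^{\boldsymbol{\alpha}}(\theta \mid \mathbf{x}) \;\propto\; \left\{\prod_{i=1}^{n} p(x_{i} \mid \theta)^{\alpha_{i}}\right\} \xi(\theta), \qquad \alpha_{i} \in [0,1],
\]
\[
\Delta_{\boldsymbol{\alpha},\mathbf{x}} \;=\; \sup_{\theta \in \Theta}\, \max_{i \in (1,\ldots,n)} \alpha_{i}\,\bigl|\log p(x_{i} \mid \theta)\bigr|, \qquad \epsilon_{\mathbf{x}} = 2\Delta_{\boldsymbol{\alpha},\mathbf{x}}.
\]
Under this sketch, the re-weighting strategy raises each $\alpha_{i}$ until its record-level bound $\Delta_{\boldsymbol{\alpha},x_{i}}$ approaches the database-level bound $\Delta_{\boldsymbol{\alpha},\mathbf{x}}$, so that no record's likelihood contribution is downweighted more than the privacy budget requires.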