We are interested in privatizing an approximate posterior inference algorithm called Expectation Propagation (EP). EP approximates the posterior by iteratively refining approximations to the local likelihoods, and is known to provide better posterior uncertainty estimates than variational inference (VI). However, using EP for large-scale datasets poses a challenge in terms of memory, as it needs to maintain each of the local approximating factors in memory. To overcome this problem, stochastic expectation propagation (SEP) was proposed, which maintains a single local factor that captures the average effect of each likelihood term on the posterior and refines it in a way analogous to EP. In terms of privacy, SEP is more amenable than EP because at each refinement step the remaining factors are fixed to the same value and, unlike in EP, do not depend on other datapoints, which makes the sensitivity analysis tractable. We provide a theoretical analysis of the privacy-accuracy trade-off in the posterior estimates under differentially private stochastic expectation propagation (DP-SEP). Furthermore, we evaluate our DP-SEP algorithm on both synthetic and real-world datasets, demonstrating the quality of its posterior estimates at different levels of guaranteed privacy.
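To make the SEP update and the privatization step concrete, the following is a minimal, illustrative sketch of a DP-SEP-style refinement loop on a toy 1-D Gaussian model (unknown mean, known noise variance), where moment matching is exact. The clipping bound and noise scale (`clip_bound`, `noise_scale`) are hypothetical placeholders; the actual DP-SEP algorithm calibrates the noise through a formal sensitivity analysis and a privacy accountant, which this sketch does not attempt.

```python
# Illustrative sketch only: one pass of SEP with a clipped, noised site
# update on a conjugate 1-D Gaussian model. Not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: x_n ~ N(true_mean, lik_var)
N, lik_var, true_mean = 200, 1.0, 2.0
x = rng.normal(true_mean, np.sqrt(lik_var), size=N)

# Natural parameters (r = mean/var, s = 1/var) of the prior N(0, 1)
r0, s0 = 0.0, 1.0

# SEP keeps a single "average" site factor f; q(theta) ∝ p0(theta) f(theta)^N
r_f, s_f = 0.0, 1e-6

# Hypothetical DP knobs: clipping bound on site updates and Gaussian noise scale
clip_bound, noise_scale = 1.0, 0.1

for n in range(N):
    # Global approximation q = prior plus N copies of the average factor
    r_q, s_q = r0 + N * r_f, s0 + N * s_f
    # Cavity: remove one copy of the average factor
    r_cav, s_cav = r_q - r_f, s_q - s_f
    # Tilted distribution: cavity times the likelihood of x[n]
    # (conjugate model, so moment matching is exact)
    r_tilt, s_tilt = r_cav + x[n] / lik_var, s_cav + 1.0 / lik_var
    # New site factor in natural parameters
    dr, ds = r_tilt - r_cav, s_tilt - s_cav
    # Privatize the site update: clip its norm, then add Gaussian noise
    norm = np.sqrt(dr**2 + ds**2)
    scale = min(1.0, clip_bound / max(norm, 1e-12))
    dr = dr * scale + noise_scale * rng.normal()
    ds = ds * scale + noise_scale * rng.normal()
    # Damped update of the average factor (step size 1/N, as in SEP)
    r_f += (dr - r_f) / N
    s_f += (ds - s_f) / N
    # Keep the averaged precision valid despite the added noise
    s_f = max(s_f, 1e-6)

post_prec = s0 + N * s_f
print("posterior mean ~", (r0 + N * r_f) / post_prec, "posterior var ~", 1.0 / post_prec)
```

Note that clipping biases the site updates and the added noise inflates the posterior variance, which is a toy-scale illustration of the privacy-accuracy trade-off analyzed in the paper.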