Differential privacy (DP) enables private data analysis but is hard to use in practice. For data controllers, who decide what output to release, choosing how much noise to add to the output is a non-trivial task because the privacy parameter $\epsilon$ is difficult to interpret. For data analysts, who submit queries, it is hard to understand how the noise DP introduces affects their tasks. To address these two challenges: 1) we define a privacy risk indicator that conveys the impact of a given $\epsilon$ on individuals' privacy, and use it to design an algorithm that chooses $\epsilon$ and releases output according to controllers' privacy preferences; 2) we introduce a utility signaling protocol that helps analysts interpret the impact of DP on their downstream tasks. We implement the algorithm and the protocol in a new platform built on top of a data escrow, which lets controllers govern dataflows while maintaining high performance. We demonstrate our contributions through an IRB-approved user study, extensive experimental evaluations, and comparisons with other DP platforms. All in all, our work makes DP easier to use by lowering adoption barriers.
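For concreteness, here is a minimal sketch of how a risk-driven choice of $\epsilon$ could work; it is not the paper's algorithm. It assumes the risk indicator is the standard Bayesian posterior bound for $\epsilon$-DP against an adversary with a fixed prior, and releases a numeric query via the Laplace mechanism; the names `epsilon_from_risk` and `release` are illustrative.

```python
import math
import numpy as np


def epsilon_from_risk(risk_threshold: float, prior: float = 0.5) -> float:
    """Largest epsilon whose worst-case Bayesian posterior stays below the threshold.

    Under epsilon-DP, an adversary's posterior belief that an individual is
    in the dataset is bounded by p * e^eps / (p * e^eps + 1 - p), where p is
    the adversary's prior. Inverting that bound for a target posterior r
    gives eps = ln(r(1 - p) / ((1 - r)p)).
    """
    r, p = risk_threshold, prior
    if not (0 < p < r < 1):
        raise ValueError("need 0 < prior < risk_threshold < 1")
    return math.log(r * (1 - p) / ((1 - r) * p))


def release(true_value: float, sensitivity: float, risk_threshold: float,
            prior: float = 0.5, rng=None):
    """Release true_value via the Laplace mechanism, with eps derived from the risk bound."""
    rng = rng or np.random.default_rng()
    eps = epsilon_from_risk(risk_threshold, prior)
    noisy = true_value + rng.laplace(scale=sensitivity / eps)
    return noisy, eps


# Example: a counting query (sensitivity 1). A controller who tolerates at most
# a 0.9 posterior against a 0.5-prior adversary gets eps = ln(9) ~ 2.20.
noisy_count, eps = release(true_value=1234, sensitivity=1.0, risk_threshold=0.9)
```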