Differential privacy is often applied with a privacy parameter that is larger than theory suggests is ideal; various informal justifications for tolerating large privacy parameters have been proposed. In this work, we consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis. In this framework, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person (i.e., all the attributes).
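To make the per-attribute distinction concrete, the following is a minimal sketch (not from the paper, all names and parameters illustrative) of the standard Laplace mechanism releasing per-attribute sums of binary attributes. Changing a single attribute of one record shifts the output by at most 1 in L1 norm, while replacing the whole record can shift it by up to d, so the same noise scale yields a per-attribute privacy parameter that is a factor of d smaller than the whole-record one.

```python
import numpy as np


def laplace_sum_release(data, noise_scale, rng=None):
    """Release per-attribute column sums of a 0/1 data matrix with Laplace noise.

    data: (n, d) array, one row per person, attributes in {0, 1}.
    noise_scale: Laplace scale b added independently to each coordinate.
    """
    rng = np.random.default_rng() if rng is None else rng
    sums = data.sum(axis=0)  # d column sums
    noise = rng.laplace(loc=0.0, scale=noise_scale, size=sums.shape)
    return sums + noise


# Sensitivities for this release (attributes bounded in {0, 1}):
#   - changing ONE attribute of one record moves one column sum by <= 1
#     => per-attribute L1 sensitivity = 1, so eps_attribute = 1 / b
#   - replacing a WHOLE record can move every column sum by <= 1
#     => whole-record L1 sensitivity = d, so eps_record = d / b
if __name__ == "__main__":
    n, d, b = 1000, 20, 5.0
    rng = np.random.default_rng(0)
    data = rng.integers(0, 2, size=(n, d))
    release = laplace_sum_release(data, noise_scale=b, rng=rng)
    print("noisy column sums:", np.round(release, 1))
    print(f"per-attribute epsilon ~ {1 / b:.2f}, whole-record epsilon ~ {d / b:.2f}")
```

This toy release only illustrates the gap between the two accounting granularities; the algorithms studied in the paper target more involved data analysis and learning tasks.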