Differential privacy is a privacy measure based on the difficulty of discriminating between similar input data. In differential privacy analyses, two data sets are typically regarded as similar when their distance does not exceed a predetermined threshold. Consequently, the framework does not account for the difficulty of distinguishing data sets that are far apart, which often contain highly private information. This problem has been pointed out in research on differential privacy for static data, where Bayesian differential privacy was proposed to provide a privacy protection level even for outlier data by utilizing the prior distribution of the data. In this study, we introduce Bayesian differential privacy to dynamical systems, provide privacy guarantees for distant input data pairs, and reveal its fundamental properties. In particular, we design a mechanism that satisfies a desired level of privacy protection and characterizes the trade-off between privacy and information utility.
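To make the motivating observation concrete, the following is a minimal sketch (not from this work; function names and parameters are illustrative) of the standard Laplace mechanism. Its ε-differential-privacy guarantee is calibrated to a sensitivity threshold, and the worst-case privacy loss between two inputs grows linearly with their distance, so input pairs far beyond the threshold receive little protection:

```python
import numpy as np

def laplace_mechanism(query_value, sensitivity, epsilon, rng):
    """Release query_value with Laplace noise of scale sensitivity/epsilon.

    This gives epsilon-differential privacy for input pairs whose
    distance is at most `sensitivity`.
    """
    scale = sensitivity / epsilon
    return query_value + rng.laplace(loc=0.0, scale=scale)

def privacy_loss_bound(distance, sensitivity, epsilon):
    """Worst-case log-likelihood ratio between the output distributions
    for two inputs at the given distance.

    The bound grows linearly in the distance, so inputs far apart
    (distance >> sensitivity) are easy to distinguish -- the gap that
    Bayesian differential privacy is meant to address.
    """
    return epsilon * distance / sensitivity

rng = np.random.default_rng(0)
noisy = laplace_mechanism(3.0, sensitivity=1.0, epsilon=1.0, rng=rng)
```

Here an input pair at distance 10 with sensitivity 1 suffers a privacy loss bound of 10ε, i.e. effectively no protection, which is the shortcoming the abstract attributes to threshold-based differential privacy.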