Governments and industries have widely adopted differential privacy as a measure to protect users' sensitive data, creating the need for new implementations of differentially private algorithms. To properly test and audit these algorithms, a suite of tools for testing the differential privacy property is needed. In this work we expand this testing suite and introduce R\'enyiTester, an algorithm that can verify whether a mechanism is R\'enyi differentially private. Our algorithm computes a lower bound on the R\'enyi divergence between the distributions of a mechanism on neighboring datasets, requiring only black-box access to samples from the audited mechanism. We test this approach on a variety of pure and R\'enyi differentially private mechanisms with diverse output spaces and show that R\'enyiTester detects implementation bugs and design flaws in these mechanisms. While verifying that a general mechanism is differentially private is known to be NP-hard, we empirically show that tools like R\'enyiTester give researchers and engineers a way to decrease the risk of deploying mechanisms that expose users' privacy.
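To make the black-box testing idea concrete, the sketch below estimates a R\'enyi divergence between the empirical output distributions of a mechanism on two neighboring datasets and compares it to a claimed (alpha, epsilon)-RDP bound. This is only an illustration under strong assumptions, not the estimator RényiTester uses: it assumes a discrete output space, uses a naive plug-in estimate rather than a statistically valid lower bound, and the `mechanism`, `dataset`, and `neighbor` arguments are hypothetical placeholders.

```python
import math
from collections import Counter


def plug_in_renyi_divergence(samples_p, samples_q, alpha):
    """Naive plug-in estimate of D_alpha(P || Q) from i.i.d. samples of P and Q.

    Assumes a discrete output space; outcomes observed under P but never under Q
    are ignored, so this estimate can understate the true divergence.
    """
    counts_p, counts_q = Counter(samples_p), Counter(samples_q)
    n_p, n_q = len(samples_p), len(samples_q)
    # D_alpha(P || Q) = 1/(alpha - 1) * log sum_x P(x)^alpha * Q(x)^(1 - alpha)
    total = sum(
        (counts_p.get(x, 0) / n_p) ** alpha * (c / n_q) ** (1 - alpha)
        for x, c in counts_q.items()
    )
    # Disjoint empirical supports: report an infinite divergence rather than -inf.
    return math.inf if total <= 0 else math.log(total) / (alpha - 1)


def audit_mechanism(mechanism, dataset, neighbor, alpha, claimed_epsilon,
                    n_samples=100_000):
    """Flag a potential (alpha, epsilon)-RDP violation for a black-box mechanism.

    `mechanism` is a hypothetical callable mapping a dataset to one random output;
    `dataset` and `neighbor` are any pair of neighboring datasets to probe.
    """
    samples_p = [mechanism(dataset) for _ in range(n_samples)]
    samples_q = [mechanism(neighbor) for _ in range(n_samples)]
    estimate = plug_in_renyi_divergence(samples_p, samples_q, alpha)
    return estimate > claimed_epsilon, estimate
```

In this toy setup, a returned value of `True` means the empirical divergence exceeded the claimed RDP bound and the implementation deserves closer inspection; a sound auditor would additionally account for sampling error before declaring a violation.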