Differential Privacy at Risk: Bridging Randomness and Privacy Budget
Authors: Ashish Dandekar (DI ENS, ENS, CNRS, PSL University & Inria, Paris, France), Debabrota Basu (Dept. of Computer Sci. and Engg., Chalmers University of Technology, Göteborg, Sweden), Stéphane Bressan (National University of Singapore, Singapore)
Volume: 2021
Issue: 1
Pages: 64–84
DOI: https://doi.org/10.2478/popets-2021-0005
Abstract: The calibration of noise for a privacy-preserving mechanism depends on the sensitivity of the query and the prescribed privacy level. A data steward must make the non-trivial choice of a privacy level that balances the requirements of users and the monetary constraints of the business entity. Firstly, we analyse the roles of the sources of randomness, namely the explicit randomness induced by the noise distribution and the implicit randomness induced by the data-generation distribution, that are involved in the design of a privacy-preserving mechanism. This finer analysis enables us to provide stronger privacy guarantees with quantifiable risks. Thus, we propose privacy at risk, a probabilistic calibration of privacy-preserving mechanisms. We provide a composition theorem that leverages privacy at risk. We instantiate the probabilistic calibration for the Laplace mechanism by providing analytical results. Secondly, we propose a cost model that bridges the gap between the privacy level and the compensation budget estimated by a GDPR-compliant business entity. The convexity of the proposed cost model leads to a unique fine-tuning of the privacy level that minimises the compensation budget. We show its effectiveness by illustrating a realistic scenario in which using privacy at risk for the Laplace mechanism avoids overestimation of the compensation budget. We quantitatively show that composition using the cost-optimal privacy at risk provides a stronger privacy guarantee than the classical advanced composition. Although the illustration is specific to the chosen cost model, it naturally extends to any convex cost model. We also provide realistic illustrations of how a data steward uses privacy at risk to balance the trade-off between utility and privacy.
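For readers unfamiliar with the calibration the abstract refers to, the sketch below illustrates the classical (non-probabilistic) Laplace mechanism: noise is drawn from a Laplace distribution whose scale is the query's L1-sensitivity divided by the privacy level ε. The function and parameter names (`laplace_mechanism`, `sensitivity`, `epsilon`) are illustrative, not taken from the paper, and the paper's privacy-at-risk refinement of this calibration is not reproduced here.

```python
import numpy as np

def laplace_mechanism(query_value, sensitivity, epsilon, rng=None):
    """Release query_value with Laplace noise calibrated to the query's
    L1-sensitivity and the prescribed privacy level epsilon."""
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon  # noise scale b = Delta f / epsilon
    return query_value + rng.laplace(loc=0.0, scale=scale)

# Example: a counting query has sensitivity 1; a smaller epsilon
# (stronger privacy) forces a larger noise scale.
true_count = 42
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```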
Keywords: Differential privacy, cost model, Laplace mechanism
Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 license.