Privacy Games: Optimal User-Centric Data Obfuscation

Authors: Reza Shokri (University of Texas at Austin)

Volume: 2015
Issue: 2
Pages: 299–315
DOI: https://doi.org/10.1515/popets-2015-0024

Abstract: Consider users who share their data (e.g., location) with an untrusted service provider to obtain a personalized (e.g., location-based) service. Data obfuscation is a prevalent user-centric approach to protecting users’ privacy in such systems: the untrusted entity receives only a noisy version of the user’s data. Perturbing data before sharing it, however, comes at the price of the user’s utility (service quality), which is an inseparable design factor of obfuscation mechanisms. The entanglement of utility loss and privacy guarantee, together with the lack of a comprehensive notion of privacy, has led to obfuscation mechanisms that are suboptimal in their utility loss, ignore the user’s information leakage in the past, or are limited to very specific notions of privacy that, for example, do not protect against adaptive inference attacks or an adversary with arbitrary background knowledge. In this paper, we design user-centric obfuscation mechanisms that impose the minimum utility loss needed to guarantee the user’s privacy. We optimize utility subject to a joint guarantee of differential privacy (indistinguishability) and distortion privacy (inference error). This double shield of protection limits information leakage through both the obfuscation mechanism and the posterior inference. We show that the privacy achieved by joint differential-distortion mechanisms against optimal attacks is as large as the maximum privacy achievable by either mechanism alone, while their utility cost is no larger than what either the differential or the distortion mechanism imposes. We model the optimization problem as a leader-follower game between the designer of the obfuscation mechanism and the potential adversary, and design adaptive mechanisms that anticipate and protect against optimal inference algorithms. The resulting obfuscation mechanism is thus optimal against any inference algorithm.
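The differential-privacy side of the optimization described in the abstract can be sketched as a linear program: choose an obfuscation mechanism p(o|s) that minimizes expected utility loss subject to indistinguishability constraints on every pair of secrets. This is a minimal illustration under assumed toy inputs, not the paper's full joint mechanism; in particular, the distortion (inference-error) constraint and the leader-follower game against an optimal Bayesian adversary are omitted, and the function name `optimal_dp_mechanism` is ours, not the authors'.

```python
import numpy as np
from scipy.optimize import linprog

def optimal_dp_mechanism(prior, loss, eps):
    """Minimize expected utility loss over row-stochastic mechanisms
    p(o|s) subject to eps-differential-privacy-style constraints
    p(o|s) <= e^eps * p(o|s') for all pairs of secrets (s, s').

    prior: shape (n,), probability of each secret s
    loss:  shape (n, n), utility loss when secret s is reported as o
    Returns the mechanism as an (n, n) matrix with rows indexed by s.
    """
    n = len(prior)
    nv = n * n                          # one LP variable per (s, o) pair
    idx = lambda s, o: s * n + o        # flatten (s, o) -> variable index

    # Objective: expected loss  sum_s prior[s] * sum_o x[s,o] * loss[s,o]
    c = np.array([prior[s] * loss[s, o]
                  for s in range(n) for o in range(n)])

    # Equality constraints: each row of the mechanism sums to 1
    A_eq = np.zeros((n, nv))
    for s in range(n):
        for o in range(n):
            A_eq[s, idx(s, o)] = 1.0
    b_eq = np.ones(n)

    # Indistinguishability: x[s,o] - e^eps * x[t,o] <= 0 for s != t
    rows = []
    for s in range(n):
        for t in range(n):
            if s == t:
                continue
            for o in range(n):
                row = np.zeros(nv)
                row[idx(s, o)] = 1.0
                row[idx(t, o)] = -np.exp(eps)
                rows.append(row)
    A_ub = np.array(rows)
    b_ub = np.zeros(len(rows))

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * nv)
    return res.x.reshape(n, n)

# Toy example: three locations on a line, loss = distance |s - o|
prior = np.ones(3) / 3
loss = np.abs(np.subtract.outer(np.arange(3), np.arange(3))).astype(float)
mechanism = optimal_dp_mechanism(prior, loss, eps=np.log(2))
```

The LP is always feasible (the uniform mechanism satisfies the constraints for any eps >= 0); the paper's joint formulation additionally folds in a lower bound on the adversary's expected inference error, which is what turns the problem into a leader-follower game.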

Keywords: Data Privacy, Obfuscation, Utility, Differential Privacy, Distortion Privacy, Inference Attack, Prior Knowledge, Optimization, Game Theory

Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution-NonCommercial-NoDerivs license.