Models Matter: Setting Accurate Privacy Expectations for Local and Central Differential Privacy

Authors: Mary Anne Smart (Purdue University), Priyanka Nanayakkara (Harvard University), Rachel Cummings (Columbia University), Gabriel Kaptchuk (University of Maryland, College Park), Elissa Redmiles (Georgetown University)

Volume: 2025
Issue: 4
Pages: 653–678
DOI: https://doi.org/10.56553/popets-2025-0150


Abstract: Differential privacy is a popular privacy-enhancing technology that has been deployed both by industry and government agencies. Unfortunately, existing explanations of differential privacy fail to set accurate privacy expectations for data subjects, expectations that depend on the choice of deployment model. We design and evaluate new explanations of differential privacy for the local and central models, drawing inspiration from prior work explaining other privacy-enhancing technologies such as encryption. We reflect on the challenges in evaluating explanations and on the tradeoffs between qualitative and quantitative evaluation strategies. These reflections offer guidance for other researchers seeking to design and evaluate explanations of privacy-enhancing technologies.

Keywords: differential privacy, human factors, usable security and privacy

Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.