Private Multi-Winner Voting for Machine Learning

Authors: Adam Dziedzic (University of Toronto and Vector Institute), Christopher A. Choquette-Choo (Google Research, Brain Team), Natalie Dullerud (University of Toronto and Vector Institute), Vinith Suriyakumar (MIT), Ali Shahin Shamsabadi (The Alan Turing Institute), Muhammad Ahmad Kaleem (University of Toronto and Vector Institute), Somesh Jha (University of Wisconsin-Madison), Nicolas Papernot (University of Toronto and Vector Institute), Xiao Wang (Northwestern University)

Volume: 2023
Issue: 1
Pages: 527–555
DOI: https://doi.org/10.56553/popets-2023-0031


Abstract: Private multi-winner voting is the task of revealing k-hot binary vectors satisfying a bounded differential privacy (DP) guarantee. This task has been understudied in the machine learning literature despite its prevalence in many domains, such as healthcare. We propose three new DP multi-winner mechanisms: Binary, Tau, and Powerset voting. Binary voting operates independently per label through composition. Tau voting bounds votes optimally in their L2 norm for tight data-independent guarantees. Powerset voting operates over the entire binary vector by viewing the possible outcomes as a power set. Our theoretical and empirical analysis shows that Binary voting can be a competitive mechanism on many tasks unless there are strong correlations between labels, in which case Powerset voting outperforms it. We use our mechanisms to enable privacy-preserving multi-label learning in the central setting by extending the canonical single-label technique: PATE. We find that our techniques outperform current state-of-the-art approaches on large, real-world healthcare data and standard multi-label benchmarks. We further enable multi-label confidential and private collaborative (CaPC) learning and show that model performance can be significantly improved in the multi-site setting.
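The Binary and Tau mechanisms described in the abstract can be sketched in code. Below is a minimal, illustrative sketch using per-label Gaussian noise: Binary voting noises each label's vote count independently (so privacy follows by composition over labels), while Tau voting clips each teacher's vote vector to an L2 norm bound tau before aggregation. All function names, parameters, and thresholds here are illustrative assumptions, not the paper's exact mechanisms or calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_voting(votes, sigma):
    """Illustrative Binary voting: noise each label's vote count
    independently with Gaussian noise; the overall DP guarantee
    follows by composition across labels (sketch, not the paper's
    exact mechanism)."""
    n_teachers, n_labels = votes.shape
    counts = votes.sum(axis=0).astype(float)
    noisy = counts + rng.normal(0.0, sigma, size=n_labels)
    # Predict label j as present if a noisy majority voted for it.
    return (noisy > n_teachers / 2).astype(int)

def tau_voting(votes, tau, sigma, threshold):
    """Illustrative Tau voting: clip each teacher's k-hot vote vector
    to L2 norm tau before summing, which bounds each teacher's
    contribution data-independently, then add Gaussian noise."""
    norms = np.linalg.norm(votes, axis=1, keepdims=True)
    scale = np.minimum(1.0, tau / np.maximum(norms, 1e-12))
    clipped_sum = (votes * scale).sum(axis=0)
    noisy = clipped_sum + rng.normal(0.0, sigma, size=votes.shape[1])
    return (noisy > threshold).astype(int)

# Toy example: 20 teachers voting on 5 binary labels.
label_probs = np.array([0.9, 0.1, 0.8, 0.2, 0.5])
votes = (rng.random((20, 5)) < label_probs).astype(int)
print("binary:", binary_voting(votes, sigma=1.0))
print("tau:   ", tau_voting(votes, tau=2.0, sigma=1.0, threshold=10.0))
```

The key design difference the abstract highlights is visible here: Binary voting pays a privacy cost per label via composition, whereas Tau voting bounds the whole vote vector at once, yielding tighter data-independent guarantees when teachers predict many labels.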

Keywords: private multi-label classification, differential privacy, machine learning, privacy, multi-label classification, Private Aggregation of Teacher Ensembles (PATE)

Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.