Communication Efficient Secure and Private Multi-Party Deep Learning

Authors: Sankha Das (Microsoft Research India), Sayak Ray Chowdhury (Microsoft Research India), Nishanth Chandran (Microsoft Research India), Divya Gupta (Microsoft Research India), Satya Lokam (Microsoft Research India), Rahul Sharma (Microsoft Research India)

Volume: 2025
Issue: 1
Pages: 169–183
DOI: https://doi.org/10.56553/popets-2025-0010


Abstract: Distributed training, which enables multiple parties to jointly train a model on their respective datasets, is a promising approach to addressing the challenge of assembling large volumes of diverse data for training modern machine learning models. However, this approach immediately raises security and privacy concerns: each party wishes to protect its data from the other parties during training, and the trained model must not leak private information through various inference attacks. In this paper, we address both concerns simultaneously by designing efficient Differentially Private, secure Multiparty Computation (DP-MPC) protocols for jointly training a model on data distributed among multiple parties. Our DP-MPC protocol in the two-party setting is 56–794× more communication-efficient and 16–182× faster than previous such protocols. Conceptually, our work simplifies and improves on previous attempts to combine techniques from secure multiparty computation and differential privacy, especially in the context of ML training.

Keywords: Differential Privacy, Secure Multi-Party Computation, Secure and Private Deep Learning, Discrete Gaussian Mechanism
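
The keywords name the discrete Gaussian mechanism, whose integer-valued noise is what lets differential privacy sit naturally inside MPC protocols that compute over rings of integers. The following is a minimal Python sketch of that idea under our own assumptions (truncated support, plaintext aggregation, hypothetical function names such as noisy_sum); it is not the paper's protocol, which would sample such noise securely inside the MPC itself.

```python
import math
import random


def sample_discrete_gaussian(sigma: float, tail: float = 12.0) -> int:
    """Return one draw from a discrete Gaussian N_Z(0, sigma^2),
    truncated to {-B, ..., B} with B = ceil(tail * sigma).

    Illustrative only: a real DP deployment should use an exact sampler
    (e.g., the rejection sampler of Canonne, Kasiviswanathan, and
    Steinke, 2020) driven by cryptographically secure randomness.
    """
    bound = int(math.ceil(tail * sigma))
    support = list(range(-bound, bound + 1))
    # Unnormalized probabilities exp(-k^2 / (2 sigma^2));
    # random.choices normalizes the weights internally.
    weights = [math.exp(-(k * k) / (2.0 * sigma * sigma)) for k in support]
    return random.choices(support, weights=weights, k=1)[0]


def noisy_sum(aggregate: int, sensitivity: float,
              epsilon: float, delta: float) -> int:
    """Add discrete Gaussian noise to an integer aggregate (e.g., a
    reconstructed sum of secret-shared, quantized gradients).

    Hypothetical helper, not part of the paper's API.
    """
    # Standard Gaussian-mechanism calibration; the discrete Gaussian
    # with the same sigma gives privacy at least as strong as the
    # continuous Gaussian.
    sigma = sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    return aggregate + sample_discrete_gaussian(sigma)


if __name__ == "__main__":
    random.seed(0)
    # Example: privatize an integer gradient sum with (1.0, 1e-5)-DP.
    print(noisy_sum(aggregate=1000, sensitivity=1.0,
                    epsilon=1.0, delta=1e-5))
```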

Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.