Communication Efficient Differentially Private Federated Learning Using Second Order Information

Authors: Mounssif Krouka (University of Oulu), Antti Koskela (Nokia Bell Labs), Tejas Kulkarni (Nokia Bell Labs)

Volume: 2025
Issue: 1
Pages: 584–612
DOI: https://doi.org/10.56553/popets-2025-0032


Abstract: Training machine learning models with differential privacy (DP) is commonly done using first-order methods such as DP-SGD. In the non-private setting, second-order methods help mitigate the slow convergence of first-order methods. DP methods that use second-order information still provide faster convergence; however, existing methods cannot easily be turned into federated learning (FL) algorithms without the excessive communication cost required to exchange Hessian or feature-covariance information between the nodes and the server. In this paper we propose DP-FedNew, a DP method for FL that uses second-order information and has a per-iteration communication cost similar to that of first-order methods such as DP Federated Averaging.
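To make the first-order baseline mentioned in the abstract concrete, below is a minimal, framework-agnostic sketch of a single DP-SGD step (per-example gradient clipping followed by Gaussian noise). This is not the paper's DP-FedNew method; the function name dp_sgd_step and parameters such as clip_norm and noise_mult are illustrative choices, not taken from the paper.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0, noise_mult=1.0, rng=None):
    """One DP-SGD update: clip each per-example gradient, sum, add Gaussian noise, average."""
    rng = np.random.default_rng() if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale the gradient down so its L2 norm is at most clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    batch = len(clipped)
    # Gaussian noise calibrated to the clipping norm (sensitivity of the sum).
    noisy_sum = np.sum(clipped, axis=0) + rng.normal(
        scale=noise_mult * clip_norm, size=params.shape
    )
    return params - lr * noisy_sum / batch

# Toy usage: two per-example gradients for a 3-dimensional parameter vector.
params = np.zeros(3)
grads = [np.array([2.0, 0.0, 1.0]), np.array([0.5, -0.5, 0.0])]
params = dp_sgd_step(params, grads)
```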

Keywords: Differential Privacy, Federated Learning, Communication-Efficiency, ADMM, Newton’s Method, Second-Order Optimization Methods, Distributed Optimization

Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.