SoK: Private Knowledge Sharing in Distributed Learning
Authors: Yasas Supeksala (Swinburne University of Technology), Thilina Ranbaduge (CSIRO's Data61), Ming Ding (CSIRO's Data61), Dinh C. Nguyen (University of Alabama), Bo Liu (University of Technology Sydney), Caslon Chua (Swinburne University of Technology), Jun Zhang (Swinburne University of Technology)
Volume: 2025
Issue: 4
Pages: 485–506
DOI: https://doi.org/10.56553/popets-2025-0141
Abstract: The rapid advancement of Artificial Intelligence (AI) has transformed various industries, leading to the widespread distribution of AI models and data across intelligent systems. As modern data-driven services increasingly integrate distributed knowledge entities, decentralized learning has become a prevalent approach to training AI models. However, this collaborative learning paradigm introduces significant security vulnerabilities and privacy challenges. This paper presents a comprehensive systematic review of private knowledge sharing in distributed learning, analyzing the key knowledge components utilized in leading distributed learning architectures. We identify critical vulnerabilities associated with these components and examine defensive strategies that safeguard privacy while mitigating potential adversarial threats. Additionally, we highlight key limitations of knowledge sharing in distributed learning and propose future research directions to enhance the security and efficiency of decentralized AI systems.
Keywords: Distributed learning, knowledge sharing, neural networks, artificial intelligence, knowledge components
Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.
