ezDPS: An Efficient and Zero-Knowledge Machine Learning Inference Pipeline
Authors: Haodi Wang (Beijing Normal University / Virginia Tech), Thang Hoang (Virginia Tech)
Volume: 2023
Issue: 2
Pages: 430–448
DOI: https://doi.org/10.56553/popets-2023-0061
Abstract: Machine Learning as a Service (MLaaS) permits resource-limited clients to access powerful data analytics services ubiquitously. Despite its merits, MLaaS poses significant concerns regarding the integrity of delegated computation and the privacy of the server's model parameters. To address these concerns, Zhang et al. (CCS'20) initiated the study of zero-knowledge Machine Learning (zkML). A few zkML schemes have been proposed since then; however, they focus on single ML classification algorithms, which may not offer satisfactory accuracy or may require large-scale training data and model parameters, neither of which is desirable for some applications. We propose ezDPS, a new efficient and zero-knowledge ML inference scheme. Unlike prior works, ezDPS is a zkML pipeline in which the data is processed in multiple stages for high accuracy. Each stage of ezDPS is harnessed with an established ML algorithm that has been shown to be effective in various applications, including the Discrete Wavelet Transform (DWT), Principal Component Analysis (PCA), and Support Vector Machine (SVM). We design new gadgets to prove ML operations effectively. We fully implemented ezDPS and assessed its performance on real datasets. Experimental results show that ezDPS is one to three orders of magnitude more efficient than the generic circuit-based approach in all metrics, while maintaining better accuracy than single ML classification approaches.
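To make the pipeline structure concrete, the sketch below shows a plaintext DWT → PCA → SVM inference flow of the kind the abstract describes; it is not ezDPS itself (which additionally produces a zero-knowledge proof of this computation), and the library choices (PyWavelets, scikit-learn), wavelet, component count, and toy data are illustrative assumptions.

```python
# Minimal plaintext sketch of a multi-stage DWT -> PCA -> SVM inference pipeline.
# All parameters and libraries here are illustrative assumptions, not the ezDPS code.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 128))   # toy signals: 200 samples of length 128
y_train = rng.integers(0, 2, size=200)  # toy binary labels

def dwt_features(X, wavelet="db2", level=2):
    """Stage 1: Discrete Wavelet Transform; keep the approximation coefficients."""
    return np.array([pywt.wavedec(x, wavelet, level=level)[0] for x in X])

# Stage 2: PCA for dimensionality reduction of the wavelet features.
pca = PCA(n_components=10).fit(dwt_features(X_train))
# Stage 3: SVM classifier on the reduced features.
svm = SVC(kernel="rbf").fit(pca.transform(dwt_features(X_train)), y_train)

def infer(x):
    """Server-side inference; a zkML scheme would also prove this was done correctly."""
    feats = pca.transform(dwt_features(x[None, :]))
    return svm.predict(feats)[0]

print(infer(rng.normal(size=128)))
```

The point of the sketch is only to show why a pipeline of established stages can outperform a single classifier on raw inputs, which is the accuracy argument the abstract makes; proving each stage in zero knowledge is the contribution of the paper.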
Keywords: Verifiable Machine Learning, Zero-Knowledge Proofs, Principal Component Analysis (PCA), Support Vector Machine (SVM)
Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.