SIGMA: Secure GPT Inference with Function Secret Sharing

Authors: Kanav Gupta (University of Maryland, College Park), Neha Jawalkar (Indian Institute of Science), Ananta Mukherjee (Microsoft Research), Nishanth Chandran (Microsoft Research), Divya Gupta (Microsoft Research), Ashish Panwar (Microsoft Research), Rahul Sharma (Microsoft Research)

Volume: 2024
Issue: 4
Pages: 61–79
DOI: https://doi.org/10.56553/popets-2024-0107

Artifact: Available


Abstract: Secure 2-party computation (2PC) enables secure inference that offers protection for both proprietary machine learning (ML) models and sensitive inputs to them. However, existing secure inference solutions suffer from high latency and communication overheads, particularly for transformers. Function secret sharing (FSS) is a recent paradigm for obtaining efficient 2PC protocols with a preprocessing phase. We provide Sigma, the first end-to-end system for secure transformer inference based on FSS. By constructing new FSS-based protocols for complex machine learning functionalities, such as Softmax, GeLU, and SiLU, and by accelerating their computation on GPUs, Sigma improves the latency of secure transformer inference by 11-19x over the state-of-the-art that uses preprocessing and GPUs. We present the first secure inference of generative pre-trained transformer (GPT) models. In particular, Sigma executes Meta's Llama2 (available on HuggingFace) with 13 billion parameters in 44 seconds and GPT2 in 1.6 seconds.
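To make the FSS workflow mentioned in the abstract concrete, the sketch below illustrates the Gen/Eval interface of an FSS scheme with an offline dealer and an online local-evaluation phase. It is a minimal, hypothetical example using the naive truth-table construction over a tiny domain; the ring size, domain size, and the stand-in nonlinearity are assumptions for illustration only and do not reflect Sigma's succinct, GPU-accelerated protocols.

```python
# Toy FSS workflow: an offline dealer generates per-party keys for a function f,
# and in the online phase each party locally evaluates its key on a public input,
# obtaining additive shares of f(x). This naive truth-table scheme is only meant
# to show the interface; it is not the construction used by Sigma.

import secrets

MOD = 2**16     # toy ring Z_{2^16} (assumption for illustration)
DOMAIN = 256    # tiny input domain so the truth table stays small

def gen(f):
    """Offline phase: dealer splits the truth table of f into two additive shares."""
    k0 = [secrets.randbelow(MOD) for _ in range(DOMAIN)]
    k1 = [(f(x) - k0[x]) % MOD for x in range(DOMAIN)]
    return k0, k1

def eval_share(key, x):
    """Online phase: each party evaluates its key locally on the public input x."""
    return key[x]

if __name__ == "__main__":
    relu = lambda x: x if x < DOMAIN // 2 else 0   # stand-in nonlinearity
    k0, k1 = gen(relu)
    x = 37
    y0, y1 = eval_share(k0, x), eval_share(k1, x)
    assert (y0 + y1) % MOD == relu(x)              # shares reconstruct f(x)
```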

Keywords: Function Secret Sharing, MPC, secure inference, transformers, GPT

Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.