Compact: Approximating Complex Activation Functions for Secure Computation

Authors: Mazharul Islam (University of Wisconsin - Madison), Sunpreet S. Arora (Visa Research), Rahul Chatterjee (University of Wisconsin - Madison), Peter Rindal (Visa Research), Maliheh Shirvanian (Netflix)

Volume: 2024
Issue: 3
Pages: 25–41
DOI: https://doi.org/10.56553/popets-2024-0065


Abstract: Secure multi-party computation (MPC) techniques can be used to provide data privacy when users query deep neural network (DNN) models hosted on a public cloud. State-of-the-art MPC techniques can be directly leveraged for DNN models that use simple activation functions (AFs) such as ReLU. However, these techniques are ineffective and/or inefficient for the complex and highly non-linear AFs used in cutting-edge DNN models. We present Compact, which produces piece-wise polynomial approximations of complex AFs to enable their efficient use with state-of-the-art MPC techniques. Compact imposes no restrictions on model training and results in near-identical model accuracy. To achieve this, we design Compact with input density awareness and use an application-specific, simulated-annealing-style optimization to generate computationally more efficient approximations of complex AFs. We extensively evaluate Compact on four different machine-learning tasks with DNN architectures that use the popular complex AFs SiLU, GELU, and Mish. Our experimental results show that Compact incurs negligible accuracy loss while being 2x-5x more computationally efficient than state-of-the-art approaches for DNN models with a large number of hidden layers. Our work eases the adoption of MPC techniques for providing user data privacy even when the queried DNN models have many hidden layers and are trained with complex AFs.
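To illustrate the core idea behind the abstract, the sketch below fits a piece-wise polynomial approximation to GELU with a density-aware weighting, so regions where pre-activations are most likely are approximated best. This is a minimal illustration only, not Compact's actual algorithm: the knot placement, polynomial degree, and the assumed standard-normal input density are illustrative choices, and the paper's simulated-annealing-style optimization of piece boundaries is not reproduced here.

```python
# Minimal sketch (not Compact's algorithm): density-weighted piece-wise
# polynomial approximation of GELU. Knots, degree, and the N(0, 1) input
# density are assumptions made for illustration.
import numpy as np
from scipy.special import erf
from scipy.stats import norm

def gelu(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return x * 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

def fit_piecewise(breaks, degree=2, samples_per_piece=2000):
    """Fit one low-degree polynomial per interval [breaks[i], breaks[i+1]]."""
    pieces = []
    for lo, hi in zip(breaks[:-1], breaks[1:]):
        xs = np.linspace(lo, hi, samples_per_piece)
        # Density awareness: weight residuals by the assumed input density,
        # so likely inputs are approximated more accurately.
        w = np.sqrt(norm.pdf(xs)) + 1e-6
        coeffs = np.polyfit(xs, gelu(xs), deg=degree, w=w)
        pieces.append(np.poly1d(coeffs))
    return pieces

def eval_piecewise(pieces, breaks, x):
    """Evaluate the approximation; inputs outside the range use the outer pieces."""
    x = np.atleast_1d(x)
    idx = np.clip(np.searchsorted(breaks, x) - 1, 0, len(pieces) - 1)
    return np.array([pieces[i](v) for i, v in zip(idx, x)])

if __name__ == "__main__":
    breaks = np.array([-6.0, -2.0, -0.5, 0.5, 2.0, 6.0])  # assumed knots
    pieces = fit_piecewise(breaks)
    xs = np.linspace(-5.0, 5.0, 101)
    err = np.abs(eval_piecewise(pieces, breaks, xs) - gelu(xs))
    print("max abs error on sample grid:", err.max())
```

In an MPC setting, evaluating such low-degree polynomials requires only secure additions and multiplications, which is why piece-wise polynomial approximations are attractive compared to directly computing highly non-linear AFs under secret sharing.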

Keywords: activation functions, neural networks, secure inference, approximation

Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.