Secure Softmax/Sigmoid for Machine-learning Computation

Published: 01 Jan 2023 · Last Modified: 28 Sep 2024 · ACSAC 2023 · CC BY-SA 4.0
Abstract: Softmax and sigmoid, composing exponential functions (e^x) and division (1/x), are activation functions often required in training. Secure computation on the non-linear, unbounded functions 1/x and e^x is already challenging, let alone their composition. Prior works aim to compute softmax by its exact formula via iteration (CrypTen, NeurIPS '21) or with ASM approximation (Falcon, PoPETS '21). They fall short in efficiency and/or accuracy. For sigmoid, existing solutions such as ABY2.0 (Usenix Security '21) compute it via piecewise functions, incurring logarithmic communication rounds. We study a rarely explored approach to secure computation using ordinary differential equations and Fourier series for numerical approximation of rational/trigonometric polynomials over composition rings. Our results include 1) the first constant-round protocol for softmax and 2) the first 1-round error-bounded protocol for approximating sigmoid. They reduce communication costs, shortening the private training process of state-of-the-art frameworks and platforms, namely CryptGPU (S&P '21), Piranha (Usenix Security '22), and quantized training from MP-SPDZ (ICML '22), while maintaining competitive accuracy.
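To illustrate the Fourier-series idea mentioned in the abstract, below is a minimal plaintext sketch: sigmoid is approximated on a bounded interval by a truncated trigonometric series whose coefficients are computed once in the clear. The interval half-width B, truncation order K, and the NumPy-based coefficient computation are illustrative assumptions; this is not the paper's 1-round secure protocol, which operates on secret-shared values.

```python
import numpy as np

def fourier_sigmoid_coeffs(B=8.0, K=10, n_grid=4096):
    """Numerically compute truncated Fourier coefficients of sigmoid on [-B, B].
    (Illustrative only: B, K, and the quadrature are assumptions, not the paper's choices.)"""
    x = np.linspace(-B, B, n_grid)
    f = 1.0 / (1.0 + np.exp(-x))
    a0 = np.trapz(f, x) / B
    a = np.array([np.trapz(f * np.cos(k * np.pi * x / B), x) / B for k in range(1, K + 1)])
    b = np.array([np.trapz(f * np.sin(k * np.pi * x / B), x) / B for k in range(1, K + 1)])
    return a0, a, b

def fourier_sigmoid(x, a0, a, b, B=8.0):
    """Evaluate the truncated series: a0/2 + sum_k a_k cos(k*pi*x/B) + b_k sin(k*pi*x/B).
    Once the coefficients are public, evaluation is a linear combination of trig terms."""
    x = np.atleast_1d(x)
    k = np.arange(1, len(a) + 1)
    return (a0 / 2
            + np.cos(np.outer(x, k) * np.pi / B) @ a
            + np.sin(np.outer(x, k) * np.pi / B) @ b)

# Quick sanity check against the exact sigmoid on a few points.
a0, a, b = fourier_sigmoid_coeffs()
xs = np.linspace(-6, 6, 5)
print(fourier_sigmoid(xs, a0, a, b))
print(1.0 / (1.0 + np.exp(-xs)))
```

Because the coefficients are public constants, the per-input work reduces to trigonometric terms of the input combined linearly, which is suggestive of why such approximations lend themselves to low-interaction secure evaluation; the abstract's error-bounded 1-round construction is, of course, more involved than this sketch.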