Abstract: Complex math functions, such as exponential and tanh, are widely applied in machine learning inference tasks like recurrent neural networks (RNNs). Although a few works have provided secure implementations of these functions, they still suffer from serious performance bottlenecks, leaving efficiency gaps in practice. To address this issue, we propose SecMath, an efficient 2-party cryptographic framework for complex math functions. Specifically, SecMath contributes novel communication-efficient protocols for secure exponential, sigmoid, and tanh operations. These protocols build on an advanced underlying primitive, silent oblivious transfer, and employ customized optimizations, including lookup-table techniques, to further improve performance. Extensive evaluations show that our new constructions outperform their counterparts in SIRNN (IEEE S&P'21) by a large margin in both communication and computation overhead. For example, the sigmoid operation of SecMath costs 4.15 KB of communication and runs in less than 0.2 milliseconds, improving upon SIRNN by up to 7.6× in communication and 2.4× in runtime.
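For reference, the plaintext functions that the secure protocols above evaluate can be sketched as follows. This is only the cleartext mathematical specification (sigmoid and tanh expressed via the exponential), not SecMath's cryptographic protocol; the function names and the numerically stable case split are illustrative choices, not taken from the paper.

```python
import math

def sigmoid(x: float) -> float:
    # Plaintext sigmoid: 1 / (1 + e^{-x}), split by sign of x
    # to avoid overflow in math.exp for large |x|.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

def tanh(x: float) -> float:
    # tanh expressed through sigmoid: tanh(x) = 2*sigmoid(2x) - 1,
    # which is why a secure exponential protocol underpins both.
    return 2.0 * sigmoid(2.0 * x) - 1.0
```

The identity tanh(x) = 2·sigmoid(2x) − 1 is one reason a single efficient secure exponential can serve both activation functions.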