Tabula: Efficiently Computing Nonlinear Activation Functions for Private Neural Network Inference

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: private neural network inference, privacy, security, performance
Abstract: Multiparty computation approaches to private neural network inference require significant communication between server and client, incur tremendous runtime penalties, and impose massive storage overheads. The primary source of these expenses is garbled circuit operations for nonlinear activation functions (typically ReLU), which require on the order of kilobytes of data transfer for each individual operation and tens of kilobytes of preprocessing storage per operation per inference. We propose a replacement for garbled circuits: Tabula, an algorithm to securely and efficiently compute single-operand nonlinear functions for private neural network inference. Tabula performs a one-time client initialization procedure with the help of a trusted third party (or via fully homomorphic encryption), operates over smaller finite fields whose elements are representable with less than 16 bits, and employs a lookup table that stores the encrypted results of nonlinear operations over secretly shared values. We show Tabula is secure under a semi-honest threat model, allowing it to be used as a replacement for garbled circuit operations. Our results show that for private neural network inference, Tabula reduces communication by a factor of more than $50 \times$, enables speedups of over $10 \times$, and reduces storage costs from $O(n)$ to $O(1)$.
One-sentence Summary: Reduce private neural network inference communication by over $50\times$ and runtime by over $10\times$ by using a lookup table to securely compute neural network nonlinear activation functions
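For intuition, the following is a minimal Python sketch of the masked-lookup-table idea described in the abstract: computing a nonlinear function (here ReLU) on an additively secret-shared value by revealing only a blinded table index. The field size, masking scheme, variable names, and table layout are illustrative assumptions, not the paper's exact protocol.

```python
import random

P = 2 ** 8  # small finite field; elements fit in well under 16 bits

def relu(x: int) -> int:
    """ReLU over a signed interpretation of field elements."""
    signed = x if x < P // 2 else x - P
    return max(signed, 0) % P

# --- Preprocessing (trusted dealer or FHE-based initialization, per activation) ---
r = random.randrange(P)                 # random input mask, secret-shared below
r_server = random.randrange(P)
r_client = (r - r_server) % P
out_mask = random.randrange(P)          # random output mask held by the client
# Table indexed by the masked input x + r; entries are masked outputs of ReLU.
table = [(relu((idx - r) % P) + out_mask) % P for idx in range(P)]
# The server stores `table` and r_server; the client stores r_client and out_mask.

# --- Online phase ---
x = 200 % P                             # true activation input (secret)
x_server = random.randrange(P)          # additive shares: x = x_server + x_client mod P
x_client = (x - x_server) % P

# Each party blinds its share; exchanging these reveals only x + r, never x.
blinded_server = (x_server + r_server) % P
blinded_client = (x_client + r_client) % P
masked_index = (blinded_server + blinded_client) % P   # = x + r mod P

server_share = table[masked_index]      # server's share of ReLU(x)
client_share = (-out_mask) % P          # client's share of ReLU(x)
assert (server_share + client_share) % P == relu(x)
```

In this sketch, the only value exchanged online is the blinded index, which is uniformly random from each party's view, and each party ends up holding an additive share of the activation output; the table itself is produced during preprocessing, which is where the trusted third party or homomorphic encryption mentioned in the abstract would come in.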