A Generic Hybrid 2PC Framework with Application to Private Inference of Unmodified Neural Networks (Extended Abstract)

Published: 04 Nov 2021, Last Modified: 15 May 2023. PRIML 2021 Poster.
Keywords: secure two-party computation, hybrid circuits, neural networks, implementation
TL;DR: We present a new framework combining five secure two-party computation protocols for privately evaluating hybrid circuits and neural networks.
Abstract: We present a new framework for generic mixed-protocol secure two-party computation (2PC) and private evaluation of neural networks, based on the recent MOTION framework (Braun et al., ePrint '20). We implement five different 2PC protocols in the semi-honest setting -- Yao's garbled circuits, arithmetic and Boolean variants of Goldreich-Micali-Wigderson (GMW), and two secret-sharing-based protocols from ABY2.0 (Patra et al., USENIX Security '21) -- together with 20 conversions among them and new optimizations. We explore the feasibility of evaluating neural networks with 2PC without modifying their structure, and provide secure tensor data types and specialized building blocks for common tensor operations. By supporting the Open Neural Network Exchange (ONNX) file format, our framework offers an easy-to-use solution for privately evaluating neural networks and is interoperable with industry-standard deep learning frameworks such as TensorFlow and PyTorch. By exploiting the networks' high-level structure and using common 2PC techniques, we obtain performance comparable to that of recent, highly optimized works and significantly better than that of generic 2PC applied to low-level hybrid circuits.
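
To make the ONNX-based workflow concrete, the sketch below exports a small, unmodified PyTorch model to an ONNX file using PyTorch's standard exporter; it does not show the framework's own API. The network architecture, shapes, and output path are illustrative assumptions, and the resulting file is what an ONNX-consuming 2PC evaluator such as the one described here would take as input for private inference.

```python
import torch
import torch.nn as nn

# Hypothetical example: a small, unmodified PyTorch network. Any model that
# can be exported to ONNX (from PyTorch, TensorFlow, etc.) could be used.
model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
).eval()

# Dummy input fixing the tensor shapes recorded in the ONNX graph.
dummy_input = torch.randn(1, 784)

# Export to the ONNX interchange format; the file describes the network as a
# graph of standard tensor operations (Gemm, ReLU, ...), which an ONNX-aware
# 2PC framework can then evaluate privately on secret-shared inputs.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",  # hypothetical output path
    input_names=["input"],
    output_names=["logits"],
)
```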