Abstract: Private learning for vertical decision trees (PVDT) is an emerging paradigm that allows multiple parties to cooperatively train and run inference on decision trees over vertically partitioned datasets, without revealing any party's data or model. State-of-the-art PVDT schemes employ secret-sharing-based secure multi-party computation (MPC) to achieve low computational cost and low bandwidth. Nevertheless, existing schemes require many communication rounds for the concrete protocols used in PVDT, such as less-than comparison and division, which makes them ill-suited to high-latency networks such as WANs. In this work, we present Swan, a two-party PVDT framework that enables a secure, accurate, and fast realization of vertical decision trees. At the core of Swan is a secure and parallel protocol for $N$-input multiplication that completes in a single communication round. This protocol forms the cornerstone of a series of secure and communication-efficient protocols tailored to less-than comparison and division. We then use these optimized protocols to refine the training and inference processes of PVDT, achieving a significant reduction in both communication cost and rounds. Experimental results show that Swan provides state-of-the-art accuracy and achieves $10.2\times$ and $2.8\times$ improvements in online training and inference latency over WAN compared to prior art.
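Swan's one-round $N$-input multiplication protocol is not detailed in the abstract. As background on the secret-sharing setting it operates in, the following is a minimal Python sketch of standard two-party additive secret sharing with Beaver-triple multiplication, the classic one-round two-input case that such $N$-input protocols generalize. The names `P`, `share`, and `beaver_mul` are illustrative and not from the paper.

```python
import secrets

P = 2**61 - 1  # modulus for the sharing ring (illustrative choice)

def share(x):
    """Split x into two additive shares with x = x0 + x1 (mod P)."""
    x0 = secrets.randbelow(P)
    return x0, (x - x0) % P

def reconstruct(s0, s1):
    """Recombine the two shares to recover the secret."""
    return (s0 + s1) % P

def beaver_mul(x_sh, y_sh, triple):
    """Multiply secret-shared x and y using a preprocessed Beaver
    triple (a, b, c) with c = a*b, itself shared between the parties.
    Online cost is one round: the parties open the masked values
    d = x - a and e = y - b, then finish locally."""
    (a0, a1), (b0, b1), (c0, c1) = triple
    x0, x1 = x_sh
    y0, y1 = y_sh
    # Opening d and e reveals nothing about x or y, since a and b
    # act as one-time masks.
    d = (x0 - a0 + x1 - a1) % P
    e = (y0 - b0 + y1 - b1) % P
    # x*y = c + d*b + e*a + d*e; each party holds one share of the sum.
    z0 = (c0 + d * b0 + e * a0 + d * e) % P
    z1 = (c1 + d * b1 + e * a1) % P
    return z0, z1

# Example: multiply 6 and 7 under sharing.
a, b = secrets.randbelow(P), secrets.randbelow(P)
triple = (share(a), share(b), share(a * b % P))
z_sh = beaver_mul(share(6), share(7), triple)
assert reconstruct(*z_sh) == 42
```

Each additional multiplication with this building block costs another round, which is exactly the overhead a one-round $N$-input protocol avoids when comparisons and divisions are expressed as products of many shared values.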
External IDs: dblp:journals/tdsc/SongCFCZCS25