DVFL: Decentralized Privacy-Preserving Vertical Federated Learning using Functional Encryption

14 Nov 2025 (modified: 01 Dec 2025) · IEEE MiTA 2026 Conference Submission · CC BY 4.0
Keywords: vertical federated learning, privacy-preserving, decentralized multi-client functional encryption
Abstract: Federated learning is a privacy-preserving distributed machine learning approach that enables multiple participants to collaboratively train a model without sharing their original training data. Most existing studies focus on privacy protection for horizontal federated learning, yet real-world scenarios show significant demand for vertical federated learning (VFL). Under the premise of privacy protection, current VFL methods struggle to balance model accuracy, training time, and communication overhead: most either sacrifice model accuracy for higher training efficiency, or accept longer training time and higher communication overhead for better accuracy. The emerging functional encryption mechanism can better balance accuracy and efficiency while preserving privacy, but in many scenarios it relies on a fully trusted third-party authority (TPA). To address these issues, we propose DVFL, a vertical federated learning framework based on decentralized multi-client functional encryption. The framework operates without TPA support, uses gradient descent for model optimization, and is compatible with any machine learning or deep learning model whose input layer is linear. Our experimental evaluation demonstrates that, compared to state-of-the-art privacy-preserving VFL methods, DVFL achieves a slight improvement in model accuracy while reducing training time by 20% to 71% and communication overhead by 34% to 56%.
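The linear-input-layer requirement mentioned in the abstract has a simple structural basis: when features are vertically partitioned across clients, a linear layer's pre-activation decomposes into per-client partial inner products, which is exactly the shape of computation that (multi-client) inner-product functional encryption can aggregate. The sketch below illustrates only this decomposition in the clear; it is not the paper's DMCFE protocol, and the client feature split is an assumed example.

```python
import numpy as np

# Illustrative sketch (not the paper's DMCFE protocol): in vertical FL,
# each client holds a disjoint slice of the feature vector. Because the
# input layer is linear, its pre-activation w . x decomposes into
# per-client partial inner products, so an aggregation of per-client
# contributions reconstructs the layer output without pooling raw data.
rng = np.random.default_rng(0)
x = rng.standard_normal(6)   # full feature vector (never held by one party)
w = rng.standard_normal(6)   # input-layer weights
clients = [slice(0, 2), slice(2, 4), slice(4, 6)]  # assumed 3-way feature split

partials = [w[s] @ x[s] for s in clients]  # each client computes its share
assert np.isclose(sum(partials), w @ x)    # aggregate equals full pre-activation
```

In DVFL these per-client shares would be encrypted, and the aggregator would learn only the summed inner product rather than any individual share.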
Submission Number: 48