EncCluster: Bringing Functional Encryption in Federated Foundational Models

Published: 01 Oct 2024 · Last Modified: 17 Oct 2024 · FL@FM-NeurIPS'24 Oral · License: CC0 1.0
Keywords: Functional Encryption, Federated Learning, Foundational Models, Probabilistic Filters, Weight Clustering
TL;DR: Scalable Functional Encryption in Federated Learning
Abstract:

Federated Learning (FL) decentralizes model training by transmitting local model updates to a central server, yet it remains vulnerable to inference attacks during these transmissions. Existing solutions, such as Differential Privacy (DP) and Functional Encryption (FE), often degrade performance or impose significant operational burdens on clients. Meanwhile, the advent of Foundation Models (FMs) has transformed FL with their adaptability and high performance across diverse tasks. However, delivering strong privacy guarantees for these highly parameterized FMs in FL using existing privacy-preserving frameworks amplifies these challenges and further complicates the efficiency-privacy trade-off. We present EncCluster, a novel method that integrates model compression through weight clustering with decentralized FE and privacy-enhancing data encoding using probabilistic filters, delivering strong privacy guarantees in FL without affecting model performance or adding unnecessary burdens to clients. We perform a comprehensive evaluation, spanning $4$ datasets and $5$ architectures, to demonstrate EncCluster's scalability across encryption levels. Our findings reveal that EncCluster significantly reduces communication costs (below even those of conventional FedAvg) and accelerates encryption by up to $1000\times$ over baselines, while maintaining high model accuracy and enhanced privacy assurances.
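To make the pipeline concrete, here is a minimal Python sketch of the two non-cryptographic ingredients the abstract names: weight clustering (a small codebook of centroids plus a per-weight cluster assignment) and probabilistic-filter encoding of those assignments. Everything in it, including the choice of plain k-means and a standard Bloom filter, the `cluster_weights` and `BloomFilter` names, and all parameter values, is an illustrative assumption rather than the paper's actual implementation; the decentralized Functional Encryption step is omitted entirely.

```python
"""Illustrative sketch only: weight clustering + probabilistic-filter
encoding, as loosely described in the abstract. Not the authors' code;
the decentralized Functional Encryption component is omitted."""
import hashlib
import numpy as np


def cluster_weights(weights: np.ndarray, n_clusters: int = 16, n_iters: int = 20):
    """Plain k-means over flattened weights. Returns a small codebook of
    centroids and one integer cluster assignment per weight, so a client
    would only need to communicate the codebook plus assignments."""
    w = weights.ravel()
    rng = np.random.default_rng(0)
    centroids = rng.choice(w, size=n_clusters, replace=False)
    for _ in range(n_iters):
        # Assign each weight to its nearest centroid, then recompute means.
        assign = np.abs(w[:, None] - centroids[None, :]).argmin(axis=1)
        for k in range(n_clusters):
            members = w[assign == k]
            if members.size:
                centroids[k] = members.mean()
    return centroids, assign


class BloomFilter:
    """Tiny Bloom filter for (position, cluster_id) pairs; a stand-in for
    the probabilistic filters referenced in the abstract."""

    def __init__(self, n_bits: int = 1 << 18, n_hashes: int = 4):
        self.bits = np.zeros(n_bits, dtype=bool)
        self.n_hashes = n_hashes

    def _indices(self, item: bytes):
        # Derive n_hashes independent bit positions from seeded SHA-256.
        for seed in range(self.n_hashes):
            h = hashlib.sha256(seed.to_bytes(2, "big") + item).digest()
            yield int.from_bytes(h[:8], "big") % self.bits.size

    def add(self, item: bytes):
        for i in self._indices(item):
            self.bits[i] = True

    def contains(self, item: bytes) -> bool:
        # No false negatives; false positives occur at a small, tunable rate.
        return all(self.bits[i] for i in self._indices(item))


# Usage: cluster one layer's weights, then encode each assignment.
layer = np.random.default_rng(1).normal(size=(256, 64)).astype(np.float32)
codebook, assignments = cluster_weights(layer, n_clusters=16)
bf = BloomFilter()
for pos, cid in enumerate(assignments):
    bf.add(f"{pos}:{cid}".encode())
# A receiver holding the filter can recover each assignment (up to the
# filter's false-positive rate) by probing every candidate cluster id.
assert bf.contains(f"0:{assignments[0]}".encode())
```

Under this reading, only the compact codebook and the filter's bit array travel over the network instead of full-precision weights, which is one plausible way communication costs could fall below those of plain FedAvg, as the abstract claims.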

Submission Number: 13